Merge branch 'develop' into 'master'

v0.3.0

See merge request slumber/multi-user!106
Commit a7e9108bff by Swann Martinez, 2021-04-14 14:32:24 +00:00
50 changed files with 1493 additions and 842 deletions

View File

@@ -2,9 +2,12 @@ stages:
     - test
     - build
     - deploy
+    - doc
 
 include:
     - local: .gitlab/ci/test.gitlab-ci.yml
     - local: .gitlab/ci/build.gitlab-ci.yml
     - local: .gitlab/ci/deploy.gitlab-ci.yml
+    - local: .gitlab/ci/doc.gitlab-ci.yml

View File

@@ -1,5 +1,6 @@
 build:
     stage: build
+    needs: ["test"]
     image: debian:stable-slim
     script:
         - rm -rf tests .git .gitignore script
@@ -7,7 +8,3 @@ build:
         name: multi_user
         paths:
             - multi_user
-    only:
-        refs:
-            - master
-            - develop

View File

@@ -1,5 +1,6 @@
 deploy:
     stage: deploy
+    needs: ["build"]
     image: slumber/docker-python
     variables:
         DOCKER_DRIVER: overlay2
@@ -15,10 +16,5 @@ deploy:
         - docker build --build-arg replication_version=${RP_VERSION} --build-arg version={VERSION} -t registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION} ./scripts/docker_server
         - echo "Pushing to gitlab registry ${VERSION}"
         - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
-        - docker tag registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION} registry.gitlab.com/slumber/multi-user/multi-user-server:latest
+        - docker tag registry.gitlab.com/slumber/multi-user/multi-user-server:${VERSION} registry.gitlab.com/slumber/multi-user/multi-user-server:${CI_COMMIT_REF_NAME}
         - docker push registry.gitlab.com/slumber/multi-user/multi-user-server
-    only:
-        refs:
-            - master
-            - develop

View File

@@ -0,0 +1,16 @@
+pages:
+    stage: doc
+    needs: ["deploy"]
+    image: python
+    script:
+        - pip install -U sphinx sphinx_rtd_theme sphinx-material
+        - sphinx-build -b html ./docs public
+    artifacts:
+        paths:
+            - public
+    only:
+        refs:
+            - master
+            - develop
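The new pages job publishes the Sphinx documentation to GitLab Pages. The build can be reproduced locally with essentially the same commands the job runs (a sketch; assumes a local Python with pip available):

    pip install -U sphinx sphinx_rtd_theme sphinx-material
    sphinx-build -b html ./docs public
    # the rendered site lands in ./public; open public/index.html to preview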

View File

@@ -158,3 +158,32 @@ All notable changes to this project will be documented in this file.
 - Material renaming
 - Default material nodes input parameters
 - blender 2.91 python api compatibility
+
+## [0.3.0] - 2021-04-14
+
+### Added
+
+- Curve material support
+- Cycles visibility settings
+- Session save/load operator
+- New scene support
+- Initial physics support
+- Initial geometry node support
+- Blender 2.93 compatibility
+
+### Changed
+
+- Host documentation on GitLab Pages
+- Event-driven updates (from the Blender dependency graph)
+
+### Fixed
+
+- Vertex group assignment
+- Parent relation could not be removed
+- Separate object
+- Delete animation
+- Sync missing holdout option for grease pencil materials
+- Sync missing `skin_vertices`
+- Exception access violation during undo/redo
+- Sync missing armature bone roll
+- Sync missing driver data_path
+- Constraint replication

View File

@@ -19,31 +19,32 @@ This tool aims to allow multiple users to work on the same scene over the network.
 
 ## Usage
 
-See the [documentation](https://multi-user.readthedocs.io/en/latest/) for details.
+See the [documentation](https://slumber.gitlab.io/multi-user/index.html) for details.
 
 ## Troubleshooting
 
-See the [troubleshooting guide](https://multi-user.readthedocs.io/en/latest/getting_started/troubleshooting.html) for tips on the most common issues.
+See the [troubleshooting guide](https://slumber.gitlab.io/multi-user/getting_started/troubleshooting.html) for tips on the most common issues.
 
 ## Current development status
 
 Currently, not all data-blocks are supported for replication over the wire. The following list summarizes the status of each one.
 
 | Name | Status | Comment |
-| ----------- | :----: | :--------------------------------------------------------------------------: |
+| -------------- | :----: | :----------------------------------------------------------: |
 | action | ✔️ | |
 | armature | ❗ | Not stable |
 | camera | ✔️ | |
 | collection | ✔️ | |
-| curve | ❗ | Nurbs not supported |
-| gpencil | ✔️ | [Airbrush not supported](https://gitlab.com/slumber/multi-user/-/issues/123) |
+| curve | ❗ | Nurbs surfaces not supported |
+| gpencil | ✔️ | |
 | image | ✔️ | |
 | mesh | ✔️ | |
 | material | ✔️ | |
-| node_groups | ❗ | Material only |
+| node_groups | ❗ | Material & Geometry only |
+| geometry nodes | ✔️ | |
 | metaball | ✔️ | |
 | object | ✔️ | |
-| textures | ❗ | Supported for modifiers only |
+| textures | ❗ | Supported for modifiers/materials/geo nodes only |
 | texts | ✔️ | |
 | scene | ✔️ | |
 | world | ✔️ | |
@@ -52,13 +53,14 @@ Currently, not all data-blocks are supported for replication over the wire.
 | texts | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
 | nla | ❌ | |
 | volumes | ✔️ | |
-| particles | ❌ | [On-going](https://gitlab.com/slumber/multi-user/-/issues/24) |
+| particles | ❗ | The cache isn't syncing. |
 | speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
 | vse | ❗ | Mask and Clip not supported yet |
 | physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
 | libraries | ❗ | Partial |
 
 ### Performance issues
 
 Since this addon is written in pure Python for research purposes, performance could be better from every perspective.
 
@@ -74,7 +76,7 @@ I'm working on it.
 
 ## Contributing
 
-See the [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation.
+See the [contributing section](https://slumber.gitlab.io/multi-user/ways_to_contribute.html) of the documentation.
 
 Feel free to [join the discord server](https://discord.gg/aBPvGws) to chat, seek help and contribute.

[Five new binary image files added (documentation screenshots, not shown): 14 KiB, 106 KiB, 17 KiB, 14 KiB and 80 KiB.]

View File

@@ -292,7 +292,7 @@ a connected user or be under :ref:`common-right<**COMMON**>` rights.
 
 The Repository panel (see image below) allows you to monitor, change datablock states and rights manually.
 
-.. figure:: img/quickstart_properties.png
+.. figure:: img/quickstart_save_session_data.png
    :align: center
 
    Repository panel
 
@@ -319,6 +319,40 @@ Here is a quick list of available actions:
 | .. image:: img/quickstart_remove.png | **Delete** | Remove the data-block from network replication |
 +---------------------------------------+-------------------+------------------------------------------------------------------------------------+
 
+Save session data
+-----------------
+
+.. danger::
+    This is an experimental feature; until the stable release, it is highly recommended to use regular .blend saves.
+
+**Save session data** allows you to create a backup of the session data.
+
+When you hit the **save session data** button, the following popup dialog will appear.
+It allows you to choose the destination folder and whether you want to run an auto-save.
+
+.. figure:: img/quickstart_save_session_data_dialog.png
+    :align: center
+
+    Save session data dialog.
+
+If you enabled the auto-save option, you can cancel it with the **Cancel auto-save** button.
+
+.. figure:: img/quickstart_save_session_data_cancel.png
+    :align: center
+
+    Cancel session auto-save.
+
+To import session data backups, use the **Multiuser session snapshot** import dialog:
+
+.. figure:: img/quickstart_import_session_data.png
+    :align: center
+
+    Import session data dialog.
+
+.. note::
+    It is not yet possible to start a session directly from a backup.
+
 .. _advanced:
 
 Advanced settings

View File

@@ -258,33 +258,55 @@ You can check that your container is running, and find its ID and name with:
 
 .. _docker-logs:
 
 Viewing logs in a docker container
 ----------------------------------
 
-Logs for the server running in a docker container can be accessed by outputting the following to a log file:
-
-.. code-block:: bash
-
-   docker log your-container-id >& dockerserver.log
-
-.. Note:: If using WSL2 on Windows 10 (Windows Subsystem for Linux), it is preferable to run a dedicated server via regular command line approach (or the associated startup script) from within Windows - docker desktop for windows 10 usually uses the WSL2 backend where it is available.
-
-.. This may not be true. Need to write up how to locally start a docker container from WSL2
-
-First, you'll need to know your container ID, which you can find by running:
+Logs for the server running in a docker container can be accessed by outputting the container logs to a log file. First, you'll need to know your container ID, which you can find by running:
 
 .. code-block:: bash
 
    docker ps
 
-If you're cloud-hosting with e.g. Google Cloud, your container will be the one associated with the `registry address <https://gitlab.com/slumber/multi-user/container_registry/1174180>`_ where your Docker image was located, e.g. registry.gitlab.com/slumber/multi-user/multi-user-server:0.2.0
-
-You can either ssh in to your server and then run
-
-.. code-block:: bash
-
-   cat your-log-name.log
-
-or view the docker container logs with
+Then, output the container logs to a file:
+
+.. code-block:: bash
+
+   docker logs your-container-id >& dockerserver.log
+
+.. Note:: If using WSL2 on Windows 10 (Windows Subsystem for Linux), it is preferable to run a dedicated server via regular command line approach (or the associated startup script) from within Windows - docker desktop for windows 10 usually uses the WSL2 backend where it is available.
+
+.. This may not be true. Need to write up how to locally start a docker container from WSL2
+
+Downloading logs from a docker container on a cloud-hosted server
+-----------------------------------------------------------------
+
+If you'd like to pull the log files from a cloud-hosted server to submit to a developer for review, a simple process using SSH and SCP is as follows:
+
+First, SSH into your instance. You can either open the `VM Instances console <https://console.cloud.google.com/compute/instances>`_ and use the browser terminal provided by Google Cloud (I had the best luck using the Google Chrome browser), or see `here <https://cloud.google.com/compute/docs/instances/connecting-advanced#thirdpartytools>`_ for how to set up your instance for SSH access from your local terminal.
+
+If using SSH from your terminal, first generate SSH keys (setting their access permissions to e.g. chmod 400 level, whereby only the user has permissions) and submit the public key to the cloud-hosted VM instance, storing the private key on your local machine.
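A minimal version of that key setup might look like the following (the key path and comment are illustrative, not prescribed by this guide):

.. code-block:: bash

   # generate an RSA key pair for the VM instance
   ssh-keygen -t rsa -f ~/.ssh/gcloud_instance -C USERNAME

   # restrict the private key so only the current user can read it
   chmod 400 ~/.ssh/gcloud_instance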
+Then, SSH into your cloud server from your local terminal with the following command:
+
+.. code-block:: bash
+
+   ssh -i PATH_TO_PRIVATE_KEY USERNAME@EXTERNAL_IP_ADDRESS
+
+Use the private key which corresponds to the public key you uploaded, and the username associated with that key (visible in the Google Cloud console for your VM instance). Use the external IP address for the server, available from the `VM Instances console <https://console.cloud.google.com/compute/instances>`_, e.g.
+
+.. code-block:: bash
+
+   ssh -i ~/.ssh/id_rsa user@xxx.xxx.xxx.xxx
+
+Once you've connected to the server's secure shell, you can generate a log file from the docker container running the replication server. First, you'll need to know your container ID, which you can find by running:
+
+.. code-block:: bash
+
+   docker ps
+
+If you're cloud-hosting with e.g. Google Cloud, your container will be the one associated with the `registry address <https://gitlab.com/slumber/multi-user/container_registry/1174180>`_ where your Docker image was located, e.g. registry.gitlab.com/slumber/multi-user/multi-user-server:latest
+
+To view the docker container logs, run:
 
 .. code-block:: bash
 
@@ -296,7 +318,29 @@ OR
 
    docker logs your-container-id
 
-Note, see these `notes <https://cloud.google.com/compute/docs/containers/deploying-containers?_ga=2.113663175.-1396941296.1606125558#viewing_container_logs>`_ for how to check server logs on Google Cloud.
+To save the output to a file, run:
+
+.. code-block:: bash
+
+   docker logs your-container-id >& dockerserver.log
+
+Now that the server logs are available in a file, we can disconnect from the secure shell (SSH), and then copy the file to the local machine using SCP. In your local terminal, execute the following:
+
+.. code-block:: bash
+
+   scp -i PATH_TO_PRIVATE_KEY USERNAME@EXTERNAL_IP_ADDRESS:"dockerserver.log" LOCAL_PATH_TO_COPY_FILE_TO
+
+e.g.
+
+.. code-block:: bash
+
+   scp -i ~/.ssh/id_rsa user@xxx.xxx.xxx.xxx:"dockerserver.log" .
+
+This copies the file dockerserver.log generated in the previous step to the current directory on the local machine. From there, you can send it to the multi-user maintainers for review.
+
+.. Note:: See these `notes <https://cloud.google.com/compute/docs/containers/deploying-containers?_ga=2.113663175.-1396941296.1606125558#viewing_container_logs>`_ for how to check server logs on Google Cloud using other tools.
 
 .. _serverstartscripts:

View File

@@ -19,7 +19,7 @@
 bl_info = {
     "name": "Multi-User",
     "author": "Swann Martinez",
-    "version": (0, 2, 0),
+    "version": (0, 3, 0),
     "description": "Enable real-time collaborative workflow inside blender",
     "blender": (2, 82, 0),
     "location": "3D View > Sidebar > Multi-User tab",
@@ -44,7 +44,7 @@ from . import environment
 
 DEPENDENCIES = {
-    ("replication", '0.1.17'),
+    ("replication", '0.1.26'),
 }
 
@@ -89,6 +89,8 @@ def register():
         type=preferences.SessionUser
     )
     bpy.types.WindowManager.user_index = bpy.props.IntProperty()
+    bpy.types.TOPBAR_MT_file_import.append(operators.menu_func_import)
+
 
 def unregister():
     from . import presence
@@ -97,6 +99,8 @@ def unregister():
     from . import preferences
     from . import addon_updater_ops
 
+    bpy.types.TOPBAR_MT_file_import.remove(operators.menu_func_import)
+
     presence.unregister()
     addon_updater_ops.unregister()
     ui.unregister()

View File

@@ -39,9 +39,10 @@ __all__ = [
     'bl_font',
     'bl_sound',
     'bl_file',
-    'bl_sequencer',
+    # 'bl_sequencer',
     'bl_node_group',
     'bl_texture',
+    "bl_particle",
 ]  # Order here defines execution order
 
 if bpy.app.version[1] >= 91:

View File

@@ -132,9 +132,6 @@ def load_fcurve(fcurve_data, fcurve):
 class BlAction(BlDatablock):
     bl_id = "actions"
     bl_class = bpy.types.Action
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'ACTION_TWEAK'
     bl_reload_parent = False

View File

@@ -25,12 +25,19 @@ from .. import presence, operators, utils
 from .bl_datablock import BlDatablock
 
 
+def get_roll(bone: bpy.types.Bone) -> float:
+    """ Compute the actual roll of a bone
+
+    :arg bone: target bone
+    :type bone: bpy.types.Bone
+    :return: float
+    """
+    return bone.AxisRollFromMatrix(bone.matrix_local.to_3x3())[1]
+
+
 class BlArmature(BlDatablock):
     bl_id = "armatures"
     bl_class = bpy.types.Armature
-    bl_delay_refresh = 1
-    bl_delay_apply = 0
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'ARMATURE_DATA'
     bl_reload_parent = False
@@ -94,7 +101,7 @@ class BlArmature(BlDatablock):
         new_bone.head = bone_data['head_local']
         new_bone.tail_radius = bone_data['tail_radius']
         new_bone.head_radius = bone_data['head_radius']
-        # new_bone.roll = bone_data['roll']
+        new_bone.roll = bone_data['roll']
 
         if 'parent' in bone_data:
             new_bone.parent = target.edit_bones[data['bones']
@@ -127,8 +134,6 @@ class BlArmature(BlDatablock):
             'parent',
             'name',
             'layers',
-            # 'roll',
-
         ]
         data = dumper.dump(instance)
 
@@ -136,6 +141,7 @@ class BlArmature(BlDatablock):
             if bone.parent:
                 data['bones'][bone.name]['parent'] = bone.parent.name
         # get the parent Object
+        # TODO: Use id_data instead
         object_users = utils.get_datablock_users(instance)[0]
         data['user'] = object_users.uuid
         data['user_name'] = object_users.name
@@ -146,6 +152,8 @@ class BlArmature(BlDatablock):
             item.name for item in container_users if isinstance(item, bpy.types.Collection)]
         data['user_scene'] = [
             item.name for item in container_users if isinstance(item, bpy.types.Scene)]
+
+        for bone in instance.bones:
+            data['bones'][bone.name]['roll'] = get_roll(bone)
 
         return data

View File

@@ -26,9 +26,6 @@ from .bl_datablock import BlDatablock
 class BlCamera(BlDatablock):
     bl_id = "cameras"
     bl_class = bpy.types.Camera
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'CAMERA_DATA'
     bl_reload_parent = False

View File

@@ -85,9 +85,6 @@ class BlCollection(BlDatablock):
     bl_id = "collections"
     bl_icon = 'FILE_FOLDER'
     bl_class = bpy.types.Collection
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = True
     bl_reload_parent = False
 
@@ -114,6 +111,10 @@ class BlCollection(BlDatablock):
         # Link children
         load_collection_childrens(data['children'], target)
 
+        # FIXME: Find a better way after the big replication refactoring
+        # Keep other users from deleting collection objects by flushing their history
+        utils.flush_history()
+
     def _dump_implementation(self, data, instance=None):
         assert(instance)

View File

@@ -26,7 +26,8 @@ from .bl_datablock import BlDatablock
 from .dump_anything import (Dumper, Loader,
                             np_load_collection,
                             np_dump_collection)
-from .bl_datablock import get_datablock_from_uuid
+from .bl_material import dump_materials_slots, load_materials_slots
 
 SPLINE_BEZIER_POINT = [
     # "handle_left_type",
@@ -68,8 +69,6 @@ CURVE_METADATA = [
     'font_bold',
     'font_bold_italic',
     'font_italic',
-    'make_local',
-    'materials',
     'name',
     'offset',
     'offset_x',
@@ -79,7 +78,6 @@ CURVE_METADATA = [
     'override_create',
     'override_library',
     'path_duration',
-    'preview',
     'render_resolution_u',
     'render_resolution_v',
     'resolution_u',
@@ -113,8 +111,6 @@ CURVE_METADATA = [
 ]
 
-
 SPLINE_METADATA = [
     'hide',
     'material_index',
@@ -141,9 +137,6 @@ SPLINE_METADATA = [
 class BlCurve(BlDatablock):
     bl_id = "curves"
     bl_class = bpy.types.Curve
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'CURVE_DATA'
     bl_reload_parent = False
@@ -161,28 +154,27 @@ class BlCurve(BlDatablock):
         for spline in data['splines'].values():
             new_spline = target.splines.new(spline['type'])
 
             # Load curve geometry data
             if new_spline.type == 'BEZIER':
                 bezier_points = new_spline.bezier_points
                 bezier_points.add(spline['bezier_points_count'])
-                np_load_collection(spline['bezier_points'], bezier_points, SPLINE_BEZIER_POINT)
+                np_load_collection(
+                    spline['bezier_points'],
+                    bezier_points,
+                    SPLINE_BEZIER_POINT)
 
-            if new_spline.type == 'POLY':
+            if new_spline.type in ['POLY', 'NURBS']:
                 points = new_spline.points
                 points.add(spline['points_count'])
                 np_load_collection(spline['points'], points, SPLINE_POINT)
 
-            # Not working for now...
-            # See https://blender.stackexchange.com/questions/7020/create-nurbs-surface-with-python
-            if new_spline.type == 'NURBS':
-                logging.error("NURBS not supported.")
-                # new_spline.points.add(len(data['splines'][spline]["points"])-1)
-                # for point_index in data['splines'][spline]["points"]:
-                #     loader.load(
-                #         new_spline.points[point_index], data['splines'][spline]["points"][point_index])
-
             loader.load(new_spline, spline)
 
+        # MATERIAL SLOTS
+        src_materials = data.get('materials', None)
+        if src_materials:
+            load_materials_slots(src_materials, target.materials)
+
     def _dump_implementation(self, data, instance=None):
         assert(instance)
         dumper = Dumper()
@@ -210,12 +202,13 @@ class BlCurve(BlDatablock):
             dumper.include_filter = SPLINE_METADATA
             spline_data = dumper.dump(spline)
 
-            spline_data['points_count'] = len(spline.points)-1
-            spline_data['points'] = np_dump_collection(spline.points, SPLINE_POINT)
+            if spline.type == 'POLY':
+                spline_data['points_count'] = len(spline.points)-1
+                spline_data['points'] = np_dump_collection(
+                    spline.points, SPLINE_POINT)
 
             spline_data['bezier_points_count'] = len(spline.bezier_points)-1
-            spline_data['bezier_points'] = np_dump_collection(spline.bezier_points, SPLINE_BEZIER_POINT)
+            spline_data['bezier_points'] = np_dump_collection(
+                spline.bezier_points, SPLINE_BEZIER_POINT)
             data['splines'][index] = spline_data
 
         if isinstance(instance, T.SurfaceCurve):
@@ -224,6 +217,9 @@ class BlCurve(BlDatablock):
             data['type'] = 'FONT'
         elif isinstance(instance, T.Curve):
             data['type'] = 'CURVE'
+
+        data['materials'] = dump_materials_slots(instance.materials)
         return data
 
     def _resolve_deps_implementation(self):
@@ -238,4 +234,15 @@ class BlCurve(BlDatablock):
                          curve.font_bold_italic,
                          curve.font_italic])
 
+        for material in self.instance.materials:
+            if material:
+                deps.append(material)
+
         return deps
+
+    def diff(self):
+        if 'EDIT' in bpy.context.mode \
+                and not self.preferences.sync_flags.sync_during_editmode:
+            return False
+        else:
+            return super().diff()

View File

@@ -56,7 +56,7 @@ def load_driver(target_datablock, src_driver):
     loader = Loader()
     drivers = target_datablock.animation_data.drivers
     src_driver_data = src_driver['driver']
-    new_driver = drivers.new(src_driver['data_path'])
+    new_driver = drivers.new(src_driver['data_path'], index=src_driver['array_index'])
 
     # Settings
     new_driver.driver.type = src_driver_data['type']
@@ -106,9 +106,6 @@ class BlDatablock(ReplicatedDatablock):
         bl_id : blender internal storage identifier
         bl_class : blender internal type
-        bl_delay_refresh : refresh rate in second for observers
-        bl_delay_apply : refresh rate in sec for apply
-        bl_automatic_push : boolean
         bl_icon : type icon (blender icon name)
         bl_check_common: enable check even in common rights
         bl_reload_parent: reload parent
@@ -129,13 +126,7 @@ class BlDatablock(ReplicatedDatablock):
         if instance and hasattr(instance, 'uuid'):
             instance.uuid = self.uuid
 
-        if logging.getLogger().level == logging.DEBUG:
-            self.diff_method = DIFF_JSON
-        else:
-            self.diff_method = DIFF_BINARY
-
-    def resolve(self):
-        datablock_ref = None
+    def resolve(self, construct=True):
         datablock_root = getattr(bpy.data, self.bl_id)
         datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
 
@@ -143,14 +134,20 @@ class BlDatablock(ReplicatedDatablock):
             try:
                 datablock_ref = datablock_root[self.data['name']]
             except Exception:
-                name = self.data.get('name')
-                logging.debug(f"Constructing {name}")
-                datablock_ref = self._construct(data=self.data)
+                pass
+
+        if construct and not datablock_ref:
+            name = self.data.get('name')
+            logging.debug(f"Constructing {name}")
+            datablock_ref = self._construct(data=self.data)
 
-        if datablock_ref:
+        if datablock_ref is not None:
             setattr(datablock_ref, 'uuid', self.uuid)
-
-        self.instance = datablock_ref
+            self.instance = datablock_ref
+            return True
+        else:
+            return False
 
     def remove_instance(self):
         """
@@ -203,6 +200,9 @@ class BlDatablock(ReplicatedDatablock):
             if 'action' in data['animation_data']:
                 target.animation_data.action = bpy.data.actions[data['animation_data']['action']]
+        # Remove existing animation data if there is nothing more to load
+        elif hasattr(target, 'animation_data') and target.animation_data:
+            target.animation_data_clear()
 
         if self.is_library:
             return
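With this change, resolve() reports whether the datablock could be found or constructed instead of silently setting self.instance. A minimal calling sketch (node stands for any BlDatablock subclass instance; the variable name is illustrative):

    # look the datablock up without constructing it
    if not node.resolve(construct=False):
        # not present locally; build it from the dumped data
        node.resolve(construct=True)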

View File

@@ -54,9 +54,6 @@ class BlFile(ReplicatedDatablock):
     bl_id = 'file'
     bl_name = "file"
     bl_class = Path
-    bl_delay_refresh = 2
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'FILE'
     bl_reload_parent = True
@@ -69,18 +66,20 @@ class BlFile(ReplicatedDatablock):
             raise FileNotFoundError(str(self.instance))
 
         self.preferences = utils.get_preferences()
-        self.diff_method = DIFF_BINARY
 
-    def resolve(self):
-        self.instance = Path(get_filepath(self.data['name']))
-
-        if not self.instance.exists():
-            logging.debug("File don't exist, loading it.")
-            self._load(self.data, self.instance)
-
-    def push(self, socket, identity=None):
-        super().push(socket, identity=None)
+    def resolve(self, construct=True):
+        if self.data:
+            self.instance = Path(get_filepath(self.data['name']))
+
+            file_exists = self.instance.exists()
+            if not file_exists:
+                logging.debug("File doesn't exist, loading it.")
+                self._load(self.data, self.instance)
+
+            return file_exists
+
+    def push(self, socket, identity=None, check_data=False):
+        super().push(socket, identity=None, check_data=False)
 
         if self.preferences.clear_memory_filecache:
             del self.data['file']

View File

@@ -30,9 +30,6 @@ from .dump_anything import Dumper, Loader
 class BlFont(BlDatablock):
     bl_id = "fonts"
     bl_class = bpy.types.VectorFont
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'FILE_FONT'
     bl_reload_parent = False

View File

@@ -109,7 +109,9 @@ def load_stroke(stroke_data, stroke):
     stroke.points.add(stroke_data["p_count"])
     np_load_collection(stroke_data['points'], stroke.points, STROKE_POINT)
 
+    # HACK: Temporary fix to trigger a BKE_gpencil_stroke_geometry_update to
+    # fix fill issues
+    stroke.uv_scale = stroke_data["uv_scale"]
 
 def dump_frame(frame):
     """ Dump a grease pencil frame to a dict
@@ -226,13 +228,9 @@ def load_layer(layer_data, layer):
         load_frame(frame_data, target_frame)
 
 
 class BlGpencil(BlDatablock):
     bl_id = "grease_pencils"
     bl_class = bpy.types.GreasePencil
-    bl_delay_refresh = 2
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'GREASEPENCIL'
     bl_reload_parent = False
@@ -265,6 +263,7 @@ class BlGpencil(BlDatablock):
             load_layer(layer_data, target_layer)
 
+        target.layers.update()
 
@@ -287,6 +286,8 @@ class BlGpencil(BlDatablock):
         for layer in instance.layers:
             data['layers'][layer.info] = dump_layer(layer)
 
+        data["active_layers"] = instance.layers.active.info
+        data["eval_frame"] = bpy.context.scene.frame_current
         return data
 
     def _resolve_deps_implementation(self):
@@ -296,3 +297,18 @@ class BlGpencil(BlDatablock):
                 deps.append(material)
 
         return deps
+
+    def layer_changed(self):
+        return self.instance.layers.active.info != self.data["active_layers"]
+
+    def frame_changed(self):
+        return bpy.context.scene.frame_current != self.data["eval_frame"]
+
+    def diff(self):
+        if self.layer_changed() \
+                or self.frame_changed() \
+                or bpy.context.mode == 'OBJECT' \
+                or self.preferences.sync_flags.sync_during_editmode:
+            return super().diff()
+        else:
+            return False

View File

@@ -51,9 +51,6 @@ format_to_ext = {
 class BlImage(BlDatablock):
     bl_id = "images"
     bl_class = bpy.types.Image
-    bl_delay_refresh = 2
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'IMAGE_DATA'
     bl_reload_parent = False

View File

@@ -29,9 +29,6 @@ POINT = ['co', 'weight_softbody', 'co_deform']
 class BlLattice(BlDatablock):
     bl_id = "lattices"
     bl_class = bpy.types.Lattice
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'LATTICE_DATA'
     bl_reload_parent = False

View File

@@ -26,9 +26,6 @@ from .bl_datablock import BlDatablock
 class BlLibrary(BlDatablock):
     bl_id = "libraries"
     bl_class = bpy.types.Library
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'LIBRARY_DATA_DIRECT'
     bl_reload_parent = False

View File

@@ -26,9 +26,6 @@ from .bl_datablock import BlDatablock
 class BlLight(BlDatablock):
     bl_id = "lights"
     bl_class = bpy.types.Light
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'LIGHT_DATA'
     bl_reload_parent = False

View File

@@ -27,9 +27,6 @@ from .bl_datablock import BlDatablock
 class BlLightprobe(BlDatablock):
     bl_id = "lightprobes"
     bl_class = bpy.types.LightProbe
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'LIGHTPROBE_GRID'
     bl_reload_parent = False

View File

@@ -27,9 +27,9 @@ from .dump_anything import Loader, Dumper
 from .bl_datablock import BlDatablock, get_datablock_from_uuid
 
 NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
+IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']
 
-def load_node(node_data, node_tree):
+def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
     """ Load a node into a node_tree from a dict
 
     :arg node_data: dumped node data
@@ -52,27 +52,135 @@ def load_node(node_data, node_tree):
 
     inputs_data = node_data.get('inputs')
     if inputs_data:
-        inputs = target_node.inputs
-        for idx, inpt in enumerate(inputs_data):
-            if idx < len(inputs) and hasattr(inputs[idx], "default_value"):
+        inputs = [i for i in target_node.inputs if i.type not in IGNORED_SOCKETS]
+        for idx, inpt in enumerate(inputs):
+            if idx < len(inputs_data) and hasattr(inpt, "default_value"):
+                loaded_input = inputs_data[idx]
                 try:
-                    inputs[idx].default_value = inpt
+                    if inpt.type in ['OBJECT', 'COLLECTION']:
+                        inpt.default_value = get_datablock_from_uuid(loaded_input, None)
+                    else:
+                        inpt.default_value = loaded_input
                 except Exception as e:
-                    logging.warning(f"Node {target_node.name} input {inputs[idx].name} parameter not supported, skipping ({e})")
+                    logging.warning(f"Node {target_node.name} input {inpt.name} parameter not supported, skipping ({e})")
             else:
                 logging.warning(f"Node {target_node.name} input length mismatch.")
 
     outputs_data = node_data.get('outputs')
     if outputs_data:
-        outputs = target_node.outputs
-        for idx, output in enumerate(outputs_data):
-            if idx < len(outputs) and hasattr(outputs[idx], "default_value"):
+        outputs = [o for o in target_node.outputs if o.type not in IGNORED_SOCKETS]
+        for idx, output in enumerate(outputs):
+            if idx < len(outputs_data) and hasattr(output, "default_value"):
+                loaded_output = outputs_data[idx]
                 try:
-                    outputs[idx].default_value = output
-                except:
-                    logging.warning(f"Node {target_node.name} output {outputs[idx].name} parameter not supported, skipping ({e})")
+                    if output.type in ['OBJECT', 'COLLECTION']:
+                        output.default_value = get_datablock_from_uuid(loaded_output, None)
+                    else:
+                        output.default_value = loaded_output
+                except Exception as e:
+                    logging.warning(
+                        f"Node {target_node.name} output {output.name} parameter not supported, skipping ({e})")
             else:
-                logging.warning(f"Node {target_node.name} output length mismatch.")
+                logging.warning(
+                    f"Node {target_node.name} output length mismatch.")
+
+
+def dump_node(node: bpy.types.ShaderNode) -> dict:
+    """ Dump a single node to a dict
+
+    :arg node: target node
+    :type node: bpy.types.Node
+    :return: dict
+    """
+    node_dumper = Dumper()
+    node_dumper.depth = 1
+    node_dumper.exclude_filter = [
+        "dimensions",
+        "show_expanded",
+        "name_full",
+        "select",
+        "bl_label",
+        "bl_height_min",
+        "bl_height_max",
+        "bl_height_default",
+        "bl_width_min",
+        "bl_width_max",
+        "type",
+        "bl_icon",
+        "bl_width_default",
+        "bl_static_type",
+        "show_tetxure",
+        "is_active_output",
+        "hide",
+        "show_options",
+        "show_preview",
+        "show_texture",
+        "outputs",
+        "width_hidden",
+        "image"
+    ]
+
+    dumped_node = node_dumper.dump(node)
+
+    if node.parent:
+        dumped_node['parent'] = node.parent.name
+
+    dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
+
+    if dump_io_needed:
+        io_dumper = Dumper()
+        io_dumper.depth = 2
+        io_dumper.include_filter = ["default_value"]
+
+        if hasattr(node, 'inputs'):
+            dumped_node['inputs'] = []
+            inputs = [i for i in node.inputs if i.type not in IGNORED_SOCKETS]
+            for idx, inpt in enumerate(inputs):
+                if hasattr(inpt, 'default_value'):
+                    if isinstance(inpt.default_value, bpy.types.ID):
+                        dumped_input = inpt.default_value.uuid
+                    else:
+                        dumped_input = io_dumper.dump(inpt.default_value)
+                    dumped_node['inputs'].append(dumped_input)
+
+        if hasattr(node, 'outputs'):
+            dumped_node['outputs'] = []
+            for idx, output in enumerate(node.outputs):
+                if output.type not in IGNORED_SOCKETS:
+                    if hasattr(output, 'default_value'):
+                        dumped_node['outputs'].append(
+                            io_dumper.dump(output.default_value))
+
+    if hasattr(node, 'color_ramp'):
+        ramp_dumper = Dumper()
+        ramp_dumper.depth = 4
+        ramp_dumper.include_filter = [
+            'elements',
+            'alpha',
+            'color',
+            'position',
+            'interpolation',
+            'hue_interpolation',
+            'color_mode'
+        ]
+        dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
+    if hasattr(node, 'mapping'):
+        curve_dumper = Dumper()
+        curve_dumper.depth = 5
+        curve_dumper.include_filter = [
+            'curves',
+            'points',
+            'location'
+        ]
+        dumped_node['mapping'] = curve_dumper.dump(node.mapping)
+    if hasattr(node, 'image') and getattr(node, 'image'):
+        dumped_node['image_uuid'] = node.image.uuid
+    if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
+        dumped_node['node_tree_uuid'] = node.node_tree.uuid
+    return dumped_node
 
 
 def load_links(links_data, node_tree):
@@ -117,92 +225,7 @@ def dump_links(links):
     return links_data
 
 
-def dump_node(node):
-    """ Dump a single node to a dict
-
-    :arg node: target node
-    :type node: bpy.types.Node
-    :retrun: dict
-    """
-    node_dumper = Dumper()
-    node_dumper.depth = 1
-    node_dumper.exclude_filter = [
-        "dimensions",
-        "show_expanded",
-        "name_full",
-        "select",
-        "bl_label",
-        "bl_height_min",
-        "bl_height_max",
-        "bl_height_default",
-        "bl_width_min",
-        "bl_width_max",
-        "type",
-        "bl_icon",
-        "bl_width_default",
-        "bl_static_type",
-        "show_tetxure",
-        "is_active_output",
-        "hide",
-        "show_options",
-        "show_preview",
-        "show_texture",
-        "outputs",
-        "width_hidden",
-        "image"
-    ]
-
-    dumped_node = node_dumper.dump(node)
-
-    dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
-
-    if dump_io_needed:
-        io_dumper = Dumper()
-        io_dumper.depth = 2
-        io_dumper.include_filter = ["default_value"]
-
-        if hasattr(node, 'inputs'):
-            dumped_node['inputs'] = []
-            for idx, inpt in enumerate(node.inputs):
-                if hasattr(inpt, 'default_value'):
-                    dumped_node['inputs'].append(io_dumper.dump(inpt.default_value))
-
-        if hasattr(node, 'outputs'):
-            dumped_node['outputs'] = []
-            for idx, output in enumerate(node.outputs):
-                if hasattr(output, 'default_value'):
-                    dumped_node['outputs'].append(io_dumper.dump(output.default_value))
-
-    if hasattr(node, 'color_ramp'):
-        ramp_dumper = Dumper()
-        ramp_dumper.depth = 4
-        ramp_dumper.include_filter = [
-            'elements',
-            'alpha',
-            'color',
-            'position',
-            'interpolation',
-            'color_mode'
-        ]
-        dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
-    if hasattr(node, 'mapping'):
-        curve_dumper = Dumper()
-        curve_dumper.depth = 5
-        curve_dumper.include_filter = [
-            'curves',
-            'points',
-            'location'
-        ]
-        dumped_node['mapping'] = curve_dumper.dump(node.mapping)
-    if hasattr(node, 'image') and getattr(node, 'image'):
-        dumped_node['image_uuid'] = node.image.uuid
-    if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
-        dumped_node['node_tree_uuid'] = node.node_tree.uuid
-    return dumped_node
-
-
-def dump_shader_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
+def dump_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
     """ Dump a shader node_tree to a dict including links and nodes
 
     :arg node_tree: dumped shader node tree
@@ -244,6 +267,7 @@ def dump_node_tree_sockets(sockets: bpy.types.Collection) -> dict:
     return sockets_data
 
+
 def load_node_tree_sockets(sockets: bpy.types.Collection,
                            sockets_data: dict):
     """ load sockets of a shader_node_tree
@@ -257,7 +281,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
     """
     # Check for removed sockets
     for socket in sockets:
-        if not [s for s in sockets_data if socket['uuid'] == s[2]]:
+        if not [s for s in sockets_data if 'uuid' in socket and socket['uuid'] == s[2]]:
             sockets.remove(socket)
 
     # Check for new sockets
@@ -271,7 +295,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
         s['uuid'] = socket_data[2]
 
-def load_shader_node_tree(node_tree_data:dict, target_node_tree:bpy.types.ShaderNodeTree)->dict:
+def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
     """Load a shader node_tree from dumped data
 
     :arg node_tree_data: dumped node data
@@ -297,6 +321,14 @@ def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
     for node in node_tree_data["nodes"]:
         load_node(node_tree_data["nodes"][node], target_node_tree)
 
+    for node_id, node_data in node_tree_data["nodes"].items():
+        target_node = target_node_tree.nodes.get(node_id, None)
+        if target_node is None:
+            continue
+        elif 'parent' in node_data:
+            target_node.parent = target_node_tree.nodes[node_data['parent']]
+        else:
+            target_node.parent = None
+
     # TODO: load only required nodes links
     # Load nodes links
     target_node_tree.links.clear()
@@ -305,9 +337,14 @@ def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
 
 def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
-    has_image = lambda node : (node.type in ['TEX_IMAGE', 'TEX_ENVIRONMENT'] and node.image)
-    has_node_group = lambda node : (hasattr(node,'node_tree') and node.node_tree)
+    def has_image(node): return (
+        node.type in ['TEX_IMAGE', 'TEX_ENVIRONMENT'] and node.image)
+
+    def has_node_group(node): return (
+        hasattr(node, 'node_tree') and node.node_tree)
+
+    def has_texture(node): return (
+        node.type in ['ATTRIBUTE_SAMPLE_TEXTURE', 'TEXTURE'] and node.texture)
 
     deps = []
 
     for node in node_tree.nodes:
@@ -315,16 +352,46 @@ def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
             deps.append(node.image)
         elif has_node_group(node):
             deps.append(node.node_tree)
+        elif has_texture(node):
+            deps.append(node.texture)
 
     return deps
 
+
+def dump_materials_slots(materials: bpy.types.bpy_prop_collection) -> list:
+    """ Dump a material slots collection
+
+    :arg materials: material slots collection to dump
+    :type materials: bpy.types.bpy_prop_collection
+    :return: list of tuples (mat_uuid, mat_name)
+    """
+    return [(m.uuid, m.name) for m in materials if m]
+
+
+def load_materials_slots(src_materials: list, dst_materials: bpy.types.bpy_prop_collection):
+    """ Load material slots
+
+    :arg src_materials: dumped material collection (ex: object.materials)
+    :type src_materials: list of tuples (uuid, name)
+    :arg dst_materials: target material collection pointer
+    :type dst_materials: bpy.types.bpy_prop_collection
+    """
+    # MATERIAL SLOTS
+    dst_materials.clear()
+
+    for mat_uuid, mat_name in src_materials:
+        mat_ref = None
+        if mat_uuid is not None:
+            mat_ref = get_datablock_from_uuid(mat_uuid, None)
+        else:
+            mat_ref = bpy.data.materials[mat_name]
+
+        dst_materials.append(mat_ref)
+
 
 class BlMaterial(BlDatablock):
     bl_id = "materials"
     bl_class = bpy.types.Material
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'MATERIAL_DATA'
     bl_reload_parent = False
@@ -348,7 +415,7 @@ class BlMaterial(BlDatablock):
         if target.node_tree is None:
             target.use_nodes = True
 
-        load_shader_node_tree(data['node_tree'], target.node_tree)
+        load_node_tree(data['node_tree'], target.node_tree)
 
     def _dump_implementation(self, data, instance=None):
         assert(instance)
@@ -409,10 +476,13 @@ class BlMaterial(BlDatablock):
                 'fill_style',
                 'gradient_type',
                 # 'fill_image',
+                'use_stroke_holdout',
+                'use_overlap_strokes',
+                'use_fill_holdout',
             ]
             data['grease_pencil'] = gp_mat_dumper.dump(instance.grease_pencil)
         elif instance.use_nodes:
-            data['node_tree'] = dump_shader_node_tree(instance.node_tree)
+            data['node_tree'] = dump_node_tree(instance.node_tree)
 
         return data
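Together, the two new helpers give a symmetric dump/load path for material slots, keyed by the addon's uuid custom property with the material name as a fallback. A usage sketch (src_mesh and dst_mesh are illustrative; the import path is assumed from this file's location in the addon):

    from multi_user.bl_types.bl_material import (dump_materials_slots,
                                                 load_materials_slots)

    # serialize one mesh's slots as (uuid, name) tuples...
    slots = dump_materials_slots(src_mesh.materials)
    # ...and rebuild them on another mesh, resolving materials by uuid
    load_materials_slots(slots, dst_mesh.materials)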

View File

@@ -26,6 +26,7 @@ from .dump_anything import Dumper, Loader, np_load_collection_primitives, np_dump
 from replication.constants import DIFF_BINARY
 from replication.exception import ContextError
 from .bl_datablock import BlDatablock, get_datablock_from_uuid
+from .bl_material import dump_materials_slots, load_materials_slots
 
 VERTICE = ['co']
 
@@ -33,6 +34,8 @@ EDGE = [
     'vertices',
     'crease',
     'bevel_weight',
+    'use_seam',
+    'use_edge_sharp',
 ]
 LOOP = [
     'vertex_index',
@@ -49,12 +52,9 @@ POLYGON = [
 class BlMesh(BlDatablock):
     bl_id = "meshes"
     bl_class = bpy.types.Mesh
-    bl_delay_refresh = 2
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'MESH_DATA'
-    bl_reload_parent = False
+    bl_reload_parent = True
 
     def _construct(self, data):
         instance = bpy.data.meshes.new(data["name"])
@@ -69,19 +69,9 @@ class BlMesh(BlDatablock):
         loader.load(target, data)
 
         # MATERIAL SLOTS
-        target.materials.clear()
-
-        for mat_uuid, mat_name in data["material_list"]:
-            mat_ref = None
-            if mat_uuid is not None:
-                mat_ref = get_datablock_from_uuid(mat_uuid, None)
-            else:
-                mat_ref = bpy.data.materials.get(mat_name, None)
-
-            if mat_ref is None:
-                raise Exception("Material doesn't exist")
-
-            target.materials.append(mat_ref)
+        src_materials = data.get('materials', None)
+        if src_materials:
+            load_materials_slots(src_materials, target.materials)
 
         # CLEAR GEOMETRY
         if target.vertices:
@@ -126,7 +116,7 @@ class BlMesh(BlDatablock):
     def _dump_implementation(self, data, instance=None):
         assert(instance)
 
-        if instance.is_editmode and not self.preferences.sync_flags.sync_during_editmode:
+        if (instance.is_editmode or bpy.context.mode == "SCULPT") and not self.preferences.sync_flags.sync_during_editmode:
             raise ContextError("Mesh is in edit mode")
         mesh = instance
 
@@ -172,9 +162,8 @@ class BlMesh(BlDatablock):
             data['vertex_colors'][color_map.name] = {}
             data['vertex_colors'][color_map.name]['data'] = np_dump_collection_primitive(color_map.data, 'color')
 
-        # Fix material index
-        data['material_list'] = [(m.uuid, m.name) for m in instance.materials if m]
+        # Materials
+        data['materials'] = dump_materials_slots(instance.materials)
 
         return data
 
     def _resolve_deps_implementation(self):
@@ -185,3 +174,10 @@ class BlMesh(BlDatablock):
                 deps.append(material)
 
         return deps
+
+    def diff(self):
+        if 'EDIT' in bpy.context.mode \
+                and not self.preferences.sync_flags.sync_during_editmode:
+            return False
+        else:
+            return super().diff()

View File

@@ -65,9 +65,6 @@ def load_metaball_elements(elements_data, elements):
 class BlMetaball(BlDatablock):
     bl_id = "metaballs"
     bl_class = bpy.types.MetaBall
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
     bl_check_common = False
     bl_icon = 'META_BALL'
     bl_reload_parent = False

View File

@@ -21,16 +21,13 @@ import mathutils
 
 from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
 from .bl_datablock import BlDatablock
-from .bl_material import (dump_shader_node_tree,
-                          load_shader_node_tree,
+from .bl_material import (dump_node_tree,
+                          load_node_tree,
                           get_node_tree_dependencies)
 
 class BlNodeGroup(BlDatablock):
     bl_id = "node_groups"
-    bl_class = bpy.types.ShaderNodeTree
-    bl_delay_refresh = 1
-    bl_delay_apply = 1
-    bl_automatic_push = True
+    bl_class = bpy.types.NodeTree
     bl_check_common = False
     bl_icon = 'NODETREE'
     bl_reload_parent = False
@@ -39,10 +36,10 @@ class BlNodeGroup(BlDatablock):
         return bpy.data.node_groups.new(data["name"], data["type"])
 
     def _load_implementation(self, data, target):
-        load_shader_node_tree(data, target)
+        load_node_tree(data, target)
 
     def _dump_implementation(self, data, instance=None):
-        return dump_shader_node_tree(instance)
+        return dump_node_tree(instance)
 
     def _resolve_deps_implementation(self):
         return get_node_tree_dependencies(self.instance)

View File

@@ -17,13 +17,139 @@
 
 import logging
+import re
 
 import bpy
 import mathutils
 from replication.exception import ContextError
 from .bl_datablock import BlDatablock, get_datablock_from_uuid
-from .dump_anything import Dumper, Loader
+from .bl_material import IGNORED_SOCKETS
+from .dump_anything import (
+    Dumper,
+    Loader,
+    np_load_collection,
+    np_dump_collection)
+
+
+SKIN_DATA = [
+    'radius',
+    'use_loose',
+    'use_root'
+]
+
+if bpy.app.version[1] >= 93:
+    SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str, float)
+else:
+    SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str)
+    logging.warning("Geometry node Float parameter not supported in \
+                    blender 2.92.")
+
+
+def get_node_group_inputs(node_group):
+    inputs = []
+    for inpt in node_group.inputs:
+        if inpt.type in IGNORED_SOCKETS:
+            continue
+        else:
+            inputs.append(inpt)
+    return inputs
+    # return [inpt.identifier for inpt in node_group.inputs if inpt.type not in IGNORED_SOCKETS]
+
+
+def dump_physics(target: bpy.types.Object) -> dict:
+    """ Dump all physics settings from a given object, excluding modifier-related
+        physics settings (such as softbody, cloth, dynapaint and fluid)
+    """
+    dumper = Dumper()
+    dumper.depth = 1
+    physics_data = {}
+
+    # Collisions (collision)
+    if target.collision and target.collision.use:
+        physics_data['collision'] = dumper.dump(target.collision)
+
+    # Field (field)
+    if target.field and target.field.type != "NONE":
+        physics_data['field'] = dumper.dump(target.field)
+
+    # Rigid Body (rigid_body)
+    if target.rigid_body:
+        physics_data['rigid_body'] = dumper.dump(target.rigid_body)
+
+    # Rigid Body constraint (rigid_body_constraint)
+    if target.rigid_body_constraint:
+        physics_data['rigid_body_constraint'] = dumper.dump(target.rigid_body_constraint)
+
+    return physics_data
+
+
+def load_physics(dumped_settings: dict, target: bpy.types.Object):
+    """ Load all physics settings into a given object, excluding modifier-related
+        physics settings (such as softbody, cloth, dynapaint and fluid)
+    """
+    loader = Loader()
+
+    if 'collision' in dumped_settings:
+        loader.load(target.collision, dumped_settings['collision'])
+
+    if 'field' in dumped_settings:
+        loader.load(target.field, dumped_settings['field'])
+
+    if 'rigid_body' in dumped_settings:
+        if not target.rigid_body:
+            bpy.ops.rigidbody.object_add({"object": target})
+        loader.load(target.rigid_body, dumped_settings['rigid_body'])
+    elif target.rigid_body:
+        bpy.ops.rigidbody.object_remove({"object": target})
+
+    if 'rigid_body_constraint' in dumped_settings:
+        if not target.rigid_body_constraint:
+            bpy.ops.rigidbody.constraint_add({"object": target})
+        loader.load(target.rigid_body_constraint, dumped_settings['rigid_body_constraint'])
+    elif target.rigid_body_constraint:
+        bpy.ops.rigidbody.constraint_remove({"object": target})
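dump_physics and load_physics are symmetric, so mirroring the non-modifier physics of one object onto another reduces to a dump followed by a load (a sketch; src and dst are illustrative bpy.types.Object references):

    settings = dump_physics(src)   # collision, field, rigid body, constraint
    load_physics(settings, dst)    # adds or removes rigid body state to match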
def dump_modifier_geometry_node_inputs(modifier: bpy.types.Modifier) -> list:
""" Dump geometry node modifier input properties
:arg modifier: geometry node modifier to dump
:type modifier: bpy.type.Modifier
"""
dumped_inputs = []
for inpt in get_node_group_inputs(modifier.node_group):
input_value = modifier[inpt.identifier]
dumped_input = None
if isinstance(input_value, bpy.types.ID):
dumped_input = input_value.uuid
elif isinstance(input_value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
dumped_input = input_value
elif hasattr(input_value, 'to_list'):
dumped_input = input_value.to_list()
dumped_inputs.append(dumped_input)
return dumped_inputs
def load_modifier_geometry_node_inputs(dumped_modifier: dict, target_modifier: bpy.types.Modifier):
""" Load geometry node modifier inputs
:arg dumped_modifier: source dumped modifier to load
:type dumped_modifier: dict
:arg target_modifier: target geometry node modifier
:type target_modifier: bpy.type.Modifier
"""
for input_index, inpt in enumerate(get_node_group_inputs(target_modifier.node_group)):
dumped_value = dumped_modifier['inputs'][input_index]
input_value = target_modifier[inpt.identifier]
if isinstance(input_value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
target_modifier[inpt.identifier] = dumped_value
elif hasattr(input_value, 'to_list'):
for index in range(len(input_value)):
input_value[index] = dumped_value[index]
elif inpt.type in ['COLLECTION', 'OBJECT']:
target_modifier[inpt.identifier] = get_datablock_from_uuid(
dumped_value, None)
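# For reference, the dumped input list is positional: one entry per socket
# returned by get_node_group_inputs(). An assumed example for a node group
# exposing an object socket, a float and a vector could look like:
#
#   ['9c6b...-object-uuid', 0.5, [1.0, 0.0, 0.0]]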
def load_pose(target_bone, data):
@ -81,25 +207,91 @@ def _is_editmode(object: bpy.types.Object) -> bool:
child_data.is_editmode)
def find_textures_dependencies(collection):
""" Check collection
"""
textures = []
for item in collection:
for attr in dir(item):
inst = getattr(item, attr)
if issubclass(type(inst), bpy.types.Texture) and inst is not None:
textures.append(inst)
return textures
def find_textures_dependencies(modifiers: bpy.types.bpy_prop_collection) -> [bpy.types.Texture]:
""" Find textures lying in a modifier stack
:arg modifiers: modifiers collection
:type modifiers: bpy.types.bpy_prop_collection
:return: list of bpy.types.Texture pointers
"""
textures = []
for mod in modifiers:
modifier_attributes = [getattr(mod, attr_name)
for attr_name in mod.bl_rna.properties.keys()]
for attr in modifier_attributes:
if issubclass(type(attr), bpy.types.Texture) and attr is not None:
textures.append(attr)
return textures
def find_geometry_nodes_dependencies(modifiers: bpy.types.bpy_prop_collection) -> [bpy.types.NodeTree]:
""" Find geometry nodes dependencies from a modifier stack
:arg modifiers: modifiers collection
:type modifiers: bpy.types.bpy_prop_collection
:return: list of bpy.types.NodeTree pointers
"""
dependencies = []
for mod in modifiers:
if mod.type == 'NODES' and mod.node_group:
dependencies.append(mod.node_group)
# for inpt in get_node_group_inputs(mod.node_group):
# parameter = mod.get(inpt.identifier)
# if parameter and isinstance(parameter, bpy.types.ID):
# dependencies.append(parameter)
return dependencies
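# Both helpers take the same modifier stack, so an object's external
# dependencies can be gathered in one pass (sketch):
#
#   deps = (find_textures_dependencies(obj.modifiers)
#           + find_geometry_nodes_dependencies(obj.modifiers))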
def dump_vertex_groups(src_object: bpy.types.Object) -> dict:
""" Dump object's vertex groups
:param target_object: dump vertex groups of this object
:type target_object: bpy.types.Object
"""
if isinstance(src_object.data, bpy.types.GreasePencil):
logging.warning(
"Grease pencil vertex groups are not supported yet. More info: https://gitlab.com/slumber/multi-user/-/issues/161")
else:
points_attr = 'vertices' if isinstance(
src_object.data, bpy.types.Mesh) else 'points'
dumped_vertex_groups = {}
# Vertex group metadata
for vg in src_object.vertex_groups:
dumped_vertex_groups[vg.index] = {
'name': vg.name,
'vertices': []
}
# Vertex group assignation
for vert in getattr(src_object.data, points_attr):
for vg in vert.groups:
vertices = dumped_vertex_groups.get(vg.group)['vertices']
vertices.append((vert.index, vg.weight))
return dumped_vertex_groups
def load_vertex_groups(dumped_vertex_groups: dict, target_object: bpy.types.Object):
""" Load object vertex groups
:param dumped_vertex_groups: vertex_groups to load
:type dumped_vertex_groups: dict
:param target_object: object to load the vertex groups into
:type target_object: bpy.types.Object
"""
target_object.vertex_groups.clear()
for vg in dumped_vertex_groups.values():
vertex_group = target_object.vertex_groups.new(name=vg['name'])
for index, weight in vg['vertices']:
vertex_group.add([index], weight, 'REPLACE')
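# The dump produced by dump_vertex_groups is keyed by group index, e.g.
# (illustrative values):
#
#   {0: {'name': 'Group', 'vertices': [(0, 1.0), (3, 0.25)]}}
#
# load_vertex_groups() recreates the groups and re-assigns each
# (vertex index, weight) pair with 'REPLACE'.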
class BlObject(BlDatablock):
bl_id = "objects"
bl_class = bpy.types.Object
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'OBJECT_DATA'
bl_reload_parent = False
@ -125,6 +317,10 @@ class BlObject(BlDatablock):
data_uuid,
find_data_from_name(data_id),
ignore=['images'])  # TODO: use resolve_from_id
if object_data is None and data_uuid:
raise Exception(f"Failed to load object {data['name']}({self.uuid})")
instance = bpy.data.objects.new(object_name, object_data)
instance.uuid = self.uuid
@ -141,21 +337,16 @@ class BlObject(BlDatablock):
data_uuid, find_data_from_name(data_id), ignore=['images'])
# vertex groups
if 'vertex_groups' in data:
target.vertex_groups.clear()
for vg in data['vertex_groups']:
vertex_group = target.vertex_groups.new(name=vg['name'])
point_attr = 'vertices' if 'vertices' in vg else 'points'
for vert in vg[point_attr]:
vertex_group.add(
[vert['index']], vert['weight'], 'REPLACE')
vertex_groups = data.get('vertex_groups', None)
if vertex_groups:
load_vertex_groups(vertex_groups, target)
object_data = target.data
# SHAPE KEYS
if 'shape_keys' in data:
target.shape_key_clear()
object_data = target.data
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data = data['shape_keys']['key_blocks'][key_block]
@ -175,8 +366,20 @@ class BlObject(BlDatablock):
# Load transformation data
loader.load(target, data)
# Object display fields
if 'display' in data:
loader.load(target.display, data['display'])
# Parenting
parent_id = data.get('parent_uid')
if parent_id:
parent = get_datablock_from_uuid(parent_id[0], bpy.data.objects[parent_id[1]])
# Avoid reloading
if target.parent != parent and parent is not None:
target.parent = parent
elif target.parent:
target.parent = None
# Pose
if 'pose' in data:
if not target.pose:
@ -211,6 +414,59 @@ class BlObject(BlDatablock):
if target.data is None and img_uuid:
target.data = get_datablock_from_uuid(img_uuid, None)
if hasattr(object_data, 'skin_vertices') \
and object_data.skin_vertices\
and 'skin_vertices' in data:
for index, skin_data in enumerate(object_data.skin_vertices):
np_load_collection(
data['skin_vertices'][index],
skin_data.data,
SKIN_DATA)
if hasattr(target, 'cycles_visibility') \
and 'cycles_visibility' in data:
loader.load(target.cycles_visibility, data['cycles_visibility'])
# TODO: handle geometry nodes input from dump_anything
if hasattr(target, 'modifiers'):
nodes_modifiers = [
mod for mod in target.modifiers if mod.type == 'NODES']
for modifier in nodes_modifiers:
load_modifier_geometry_node_inputs(
data['modifiers'][modifier.name], modifier)
particles_modifiers = [
mod for mod in target.modifiers if mod.type == 'PARTICLE_SYSTEM']
for mod in particles_modifiers:
default = mod.particle_system.settings
dumped_particles = data['modifiers'][mod.name]['particle_system']
loader.load(mod.particle_system, dumped_particles)
settings = get_datablock_from_uuid(dumped_particles['settings_uuid'], None)
if settings:
mod.particle_system.settings = settings
# Hack to remove the default generated particle settings
if not default.uuid:
bpy.data.particles.remove(default)
phys_modifiers = [
mod for mod in target.modifiers if mod.type in ['SOFT_BODY', 'CLOTH']]
for mod in phys_modifiers:
loader.load(mod.settings, data['modifiers'][mod.name]['settings'])
# PHYSICS
load_physics(data, target)
transform = data.get('transforms', None)
if transform:
target.matrix_parent_inverse = mathutils.Matrix(
transform['matrix_parent_inverse'])
target.matrix_basis = mathutils.Matrix(transform['matrix_basis'])
target.matrix_local = mathutils.Matrix(transform['matrix_local'])
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -225,9 +481,7 @@ class BlObject(BlDatablock):
dumper.include_filter = [
"name",
"rotation_mode",
"parent",
"data",
"children",
"library",
"empty_display_type",
"empty_display_size",
@ -241,8 +495,6 @@ class BlObject(BlDatablock):
"color",
"instance_collection",
"instance_type",
"location",
"scale",
'lock_location',
'lock_rotation',
'lock_scale',
@ -256,12 +508,16 @@ class BlObject(BlDatablock):
'show_all_edges',
'show_texture_space',
'show_in_front',
'type',
'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler',
]
data = dumper.dump(instance)
dumper.include_filter = [
'matrix_parent_inverse',
'matrix_local',
'matrix_basis']
data['transforms'] = dumper.dump(instance)
dumper.include_filter = [
'show_shadows',
]
@ -271,14 +527,39 @@ class BlObject(BlDatablock):
if self.is_library:
return data
# PARENTING
if instance.parent:
data['parent_uid'] = (instance.parent.uuid, instance.parent.name)
# MODIFIERS
if hasattr(instance, 'modifiers'):
data["modifiers"] = {}
modifiers = getattr(instance, 'modifiers', None)
if modifiers:
dumper.include_filter = None
dumper.depth = 1
data["modifiers"] = {}
dumper.exclude_filter = ['is_active']
for index, modifier in enumerate(modifiers):
data["modifiers"][modifier.name] = dumper.dump(modifier)
dumped_modifier = dumper.dump(modifier)
# hack to dump geometry nodes inputs
if modifier.type == 'NODES':
dumped_inputs = dump_modifier_geometry_node_inputs(
modifier)
dumped_modifier['inputs'] = dumped_inputs
elif modifier.type == 'PARTICLE_SYSTEM':
dumper.exclude_filter = [
"is_edited",
"is_editable",
"is_global_hair"
]
dumped_modifier['particle_system'] = dumper.dump(modifier.particle_system)
dumped_modifier['particle_system']['settings_uuid'] = modifier.particle_system.settings.uuid
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
dumped_modifier['settings'] = dumper.dump(modifier.settings)
data["modifiers"][modifier.name] = dumped_modifier
gp_modifiers = getattr(instance, 'grease_pencil_modifiers', None)
@ -301,8 +582,10 @@ class BlObject(BlDatablock):
'location']
gp_mod_data['curve'] = curve_dumper.dump(modifier.curve)
# CONSTRAINTS
if hasattr(instance, 'constraints'):
dumper.include_filter = None
dumper.depth = 3
data["constraints"] = dumper.dump(instance.constraints)
@ -344,39 +627,9 @@ class BlObject(BlDatablock):
bone_groups[group.name] = dumper.dump(group)
data['pose']['bone_groups'] = bone_groups
# CHILDS
if len(instance.children) > 0:
childs = []
for child in instance.children:
childs.append(child.name)
data["children"] = childs
# VERTEX GROUP
if len(instance.vertex_groups) > 0:
points_attr = 'vertices' if isinstance(
instance.data, bpy.types.Mesh) else 'points'
vg_data = []
for vg in instance.vertex_groups:
vg_idx = vg.index
dumped_vg = {}
dumped_vg['name'] = vg.name
vertices = []
for i, v in enumerate(getattr(instance.data, points_attr)):
for vg in v.groups:
if vg.group == vg_idx:
vertices.append({
'index': i,
'weight': vg.weight
})
dumped_vg['vertices'] = vertices
vg_data.append(dumped_vg)
data['vertex_groups'] = vg_data
data['vertex_groups'] = dump_vertex_groups(instance)
# SHAPE KEYS
object_data = instance.data
@ -406,6 +659,29 @@ class BlObject(BlDatablock):
key_blocks[key.name]['relative_key'] = key.relative_key.name
data['shape_keys']['key_blocks'] = key_blocks
# SKIN VERTICES
if hasattr(object_data, 'skin_vertices') and object_data.skin_vertices:
skin_vertices = list()
for skin_data in object_data.skin_vertices:
skin_vertices.append(
np_dump_collection(skin_data.data, SKIN_DATA))
data['skin_vertices'] = skin_vertices
# CYCLE SETTINGS
if hasattr(instance, 'cycles_visibility'):
dumper.include_filter = [
'camera',
'diffuse',
'glossy',
'transmission',
'scatter',
'shadow',
]
data['cycles_visibility'] = dumper.dump(instance.cycles_visibility)
# PHYSICS
data.update(dump_physics(instance))
return data
def _resolve_deps_implementation(self):
@ -414,17 +690,23 @@ class BlObject(BlDatablock):
# Avoid Empty case
if self.instance.data:
deps.append(self.instance.data)
if len(self.instance.children) > 0:
deps.extend(list(self.instance.children))
# Particle systems
for particle_slot in self.instance.particle_systems:
deps.append(particle_slot.settings)
if self.is_library:
deps.append(self.instance.library)
if self.instance.parent:
deps.append(self.instance.parent)
if self.instance.instance_type == 'COLLECTION':
# TODO: uuid based
deps.append(self.instance.instance_collection)
if self.instance.modifiers:
deps.extend(find_textures_dependencies(self.instance.modifiers))
deps.extend(find_geometry_nodes_dependencies(self.instance.modifiers))
return deps

View File

@ -0,0 +1,90 @@
import bpy
import mathutils
from . import dump_anything
from .bl_datablock import BlDatablock, get_datablock_from_uuid
def dump_textures_slots(texture_slots: bpy.types.bpy_prop_collection) -> list:
""" Dump every texture slot collection as the form:
[(index, slot_texture_uuid, slot_texture_name), (), ...]
"""
dumped_slots = []
for index, slot in enumerate(texture_slots):
if slot and slot.texture:
dumped_slots.append((index, slot.texture.uuid, slot.texture.name))
return dumped_slots
def load_texture_slots(dumped_slots: list, target_slots: bpy.types.bpy_prop_collection):
"""
"""
for index, slot in enumerate(target_slots):
if slot:
target_slots.clear(index)
for index, slot_uuid, slot_name in dumped_slots:
target_slots.create(index).texture = get_datablock_from_uuid(
slot_uuid, slot_name
)
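# Example (illustrative): a particle settings datablock with textures in
# slots 0 and 2 dumps to
#
#   [(0, '3f2a...-uuid', 'Noise'), (2, 'b7c1...-uuid', 'Clouds')]
#
# so empty slots are skipped and restored at their original index.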
IGNORED_ATTR = [
"is_embedded_data",
"is_evaluated",
"is_fluid",
"is_library_indirect",
"users"
]
class BlParticle(BlDatablock):
bl_id = "particles"
bl_class = bpy.types.ParticleSettings
bl_icon = "PARTICLES"
bl_check_common = False
bl_reload_parent = False
def _construct(self, data):
instance = bpy.data.particles.new(data["name"])
instance.uuid = self.uuid
return instance
def _load_implementation(self, data, target):
dump_anything.load(target, data)
dump_anything.load(target.effector_weights, data["effector_weights"])
# Force field
force_field_1 = data.get("force_field_1", None)
if force_field_1:
dump_anything.load(target.force_field_1, force_field_1)
force_field_2 = data.get("force_field_2", None)
if force_field_2:
dump_anything.load(target.force_field_2, force_field_2)
# Texture slots
load_texture_slots(data["texture_slots"], target.texture_slots)
def _dump_implementation(self, data, instance=None):
assert instance
dumper = dump_anything.Dumper()
dumper.depth = 1
dumper.exclude_filter = IGNORED_ATTR
data = dumper.dump(instance)
# Particle effectors
data["effector_weights"] = dumper.dump(instance.effector_weights)
if instance.force_field_1:
data["force_field_1"] = dumper.dump(instance.force_field_1)
if instance.force_field_2:
data["force_field_2"] = dumper.dump(instance.force_field_2)
# Texture slots
data["texture_slots"] = dump_textures_slots(instance.texture_slots)
return data
def _resolve_deps_implementation(self):
return [t.texture for t in self.instance.texture_slots if t and t.texture]

View File

@ -17,16 +17,19 @@
import logging
from pathlib import Path
import bpy
import mathutils
from deepdiff import DeepDiff
from replication.constants import DIFF_JSON, MODIFIED
from ..utils import flush_history
from .bl_collection import (dump_collection_children, dump_collection_objects,
load_collection_childrens, load_collection_objects,
resolve_collection_dependencies)
from .bl_datablock import BlDatablock
from .bl_file import get_filepath
from .dump_anything import Dumper, Loader
RENDER_SETTINGS = [
@ -265,28 +268,116 @@ VIEW_SETTINGS = [
]
def dump_sequence(sequence: bpy.types.Sequence) -> dict:
""" Dump a sequence to a dict
:arg sequence: sequence to dump
:type sequence: bpy.types.Sequence
:return dict:
"""
dumper = Dumper()
dumper.exclude_filter = [
'lock',
'select',
'select_left_handle',
'select_right_handle',
'strobe'
]
dumper.depth = 1
data = dumper.dump(sequence)
# TODO: Support multiple images
if sequence.type == 'IMAGE':
data['filenames'] = [e.filename for e in sequence.elements]
# Effect strip inputs
input_count = getattr(sequence, 'input_count', None)
if input_count:
for n in range(input_count):
input_name = f"input_{n+1}"
data[input_name] = getattr(sequence, input_name).name
return data
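# An IMAGE strip could dump to something like (illustrative, the exact keys
# depend on what Dumper finds on the strip):
#
#   {'name': 'render', 'type': 'IMAGE', 'channel': 1, 'frame_start': 1.0,
#    'filenames': ['frame_0001.png', 'frame_0002.png']}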
def load_sequence(sequence_data: dict, sequence_editor: bpy.types.SequenceEditor):
""" Load sequence from dumped data
:arg sequence_data: sequence to dump
:type sequence_data:dict
:arg sequence_editor: root sequence editor
:type sequence_editor: bpy.types.SequenceEditor
"""
strip_type = sequence_data.get('type')
strip_name = sequence_data.get('name')
strip_channel = sequence_data.get('channel')
strip_frame_start = sequence_data.get('frame_start')
sequence = sequence_editor.sequences_all.get(strip_name, None)
if sequence is None:
if strip_type == 'SCENE':
strip_scene = bpy.data.scenes.get(sequence_data.get('scene'))
sequence = sequence_editor.sequences.new_scene(strip_name,
strip_scene,
strip_channel,
strip_frame_start)
elif strip_type == 'MOVIE':
filepath = get_filepath(Path(sequence_data['filepath']).name)
sequence = sequence_editor.sequences.new_movie(strip_name,
filepath,
strip_channel,
strip_frame_start)
elif strip_type == 'SOUND':
filepath = bpy.data.sounds[sequence_data['sound']].filepath
sequence = sequence_editor.sequences.new_sound(strip_name,
filepath,
strip_channel,
strip_frame_start)
elif strip_type == 'IMAGE':
images_name = sequence_data.get('filenames')
filepath = get_filepath(images_name[0])
sequence = sequence_editor.sequences.new_image(strip_name,
filepath,
strip_channel,
strip_frame_start)
# load other images
if len(images_name) > 1:
for img_idx in range(1, len(images_name)):
sequence.elements.append(images_name[img_idx])
else:
seq = {}
for i in range(sequence_data['input_count']):
seq[f"seq{i+1}"] = sequence_editor.sequences_all.get(sequence_data.get(f"input_{i+1}", None))
sequence = sequence_editor.sequences.new_effect(name=strip_name,
type=strip_type,
channel=strip_channel,
frame_start=strip_frame_start,
frame_end=sequence_data['frame_final_end'],
**seq)
loader = Loader()
# TODO: Support filepath updates
loader.exclude_filter = ['filepath', 'sound', 'filenames', 'fps']
loader.load(sequence, sequence_data)
sequence.select = False
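# Typical call site (sketch, mirroring BlScene._load_implementation below):
#
#   scene.sequence_editor_create()
#   for seq_data in data['sequences'].values():
#       load_sequence(seq_data, scene.sequence_editor)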
class BlScene(BlDatablock):
bl_id = "scenes"
bl_class = bpy.types.Scene
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'SCENE_DATA'
bl_reload_parent = False
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.diff_method = DIFF_JSON
def _construct(self, data):
instance = bpy.data.scenes.new(data["name"])
instance.uuid = self.uuid
return instance
def _load_implementation(self, data, target):
@ -328,6 +419,29 @@ class BlScene(BlDatablock):
'view_settings']['curve_mapping']['black_level']
target.view_settings.curve_mapping.update()
# Sequencer
sequences = data.get('sequences')
if sequences:
# Create sequencer data
target.sequence_editor_create()
vse = target.sequence_editor
# Clear removed sequences
for seq in vse.sequences_all:
if seq.name not in sequences:
vse.sequences.remove(seq)
# Load existing sequences
for seq_name, seq_data in sequences.items():
load_sequence(seq_data, vse)
# If the sequence is no longer used, clear it
elif target.sequence_editor and not sequences:
target.sequence_editor_clear()
# FIXME: Find a better way after the replication big refactoring
# Keep other user from deleting collection object by flushing their history
flush_history()
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -386,10 +500,14 @@ class BlScene(BlDatablock):
data['view_settings']['curve_mapping']['curves'] = scene_dumper.dump(
instance.view_settings.curve_mapping.curves)
if instance.sequence_editor:
data['has_sequence'] = True
else:
data['has_sequence'] = False
# Sequence
vse = instance.sequence_editor
if vse:
dumped_sequences = {}
for seq in vse.sequences_all:
dumped_sequences[seq.name] = dump_sequence(seq)
data['sequences'] = dumped_sequences
return data
@ -408,9 +526,18 @@ class BlScene(BlDatablock):
deps.append(self.instance.grease_pencil)
# Sequences
# deps.extend(list(self.instance.sequence_editor.sequences_all))
if self.instance.sequence_editor:
deps.append(self.instance.sequence_editor)
vse = self.instance.sequence_editor
if vse:
for sequence in vse.sequences_all:
if sequence.type == 'MOVIE' and sequence.filepath:
deps.append(Path(bpy.path.abspath(sequence.filepath)))
elif sequence.type == 'SOUND' and sequence.sound:
deps.append(sequence.sound)
elif sequence.type == 'IMAGE':
for elem in sequence.elements:
deps.append(
Path(bpy.path.abspath(sequence.directory),
elem.filename))
return deps

View File

@ -1,198 +0,0 @@
# ##### BEGIN GPL LICENSE BLOCK #####
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#
# ##### END GPL LICENSE BLOCK #####
import bpy
import mathutils
from pathlib import Path
import logging
from .bl_file import get_filepath
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock, get_datablock_from_uuid
def dump_sequence(sequence: bpy.types.Sequence) -> dict:
""" Dump a sequence to a dict
:arg sequence: sequence to dump
:type sequence: bpy.types.Sequence
:return dict:
"""
dumper = Dumper()
dumper.exclude_filter = [
'lock',
'select',
'select_left_handle',
'select_right_handle',
'strobe'
]
dumper.depth = 1
data = dumper.dump(sequence)
# TODO: Support multiple images
if sequence.type == 'IMAGE':
data['filenames'] = [e.filename for e in sequence.elements]
# Effect strip inputs
input_count = getattr(sequence, 'input_count', None)
if input_count:
for n in range(input_count):
input_name = f"input_{n+1}"
data[input_name] = getattr(sequence, input_name).name
return data
def load_sequence(sequence_data: dict, sequence_editor: bpy.types.SequenceEditor):
""" Load sequence from dumped data
:arg sequence_data: sequence to dump
:type sequence_data:dict
:arg sequence_editor: root sequence editor
:type sequence_editor: bpy.types.SequenceEditor
"""
strip_type = sequence_data.get('type')
strip_name = sequence_data.get('name')
strip_channel = sequence_data.get('channel')
strip_frame_start = sequence_data.get('frame_start')
sequence = sequence_editor.sequences_all.get(strip_name, None)
if sequence is None:
if strip_type == 'SCENE':
strip_scene = bpy.data.scenes.get(sequence_data.get('scene'))
sequence = sequence_editor.sequences.new_scene(strip_name,
strip_scene,
strip_channel,
strip_frame_start)
elif strip_type == 'MOVIE':
filepath = get_filepath(Path(sequence_data['filepath']).name)
sequence = sequence_editor.sequences.new_movie(strip_name,
filepath,
strip_channel,
strip_frame_start)
elif strip_type == 'SOUND':
filepath = bpy.data.sounds[sequence_data['sound']].filepath
sequence = sequence_editor.sequences.new_sound(strip_name,
filepath,
strip_channel,
strip_frame_start)
elif strip_type == 'IMAGE':
images_name = sequence_data.get('filenames')
filepath = get_filepath(images_name[0])
sequence = sequence_editor.sequences.new_image(strip_name,
filepath,
strip_channel,
strip_frame_start)
# load other images
if len(images_name)>1:
for img_idx in range(1,len(images_name)):
sequence.elements.append((images_name[img_idx]))
else:
seq = {}
for i in range(sequence_data['input_count']):
seq[f"seq{i+1}"] = sequence_editor.sequences_all.get(sequence_data.get(f"input_{i+1}", None))
sequence = sequence_editor.sequences.new_effect(name=strip_name,
type=strip_type,
channel=strip_channel,
frame_start=strip_frame_start,
frame_end=sequence_data['frame_final_end'],
**seq)
loader = Loader()
loader.load(sequence, sequence_data)
sequence.select = False
class BlSequencer(BlDatablock):
bl_id = "scenes"
bl_class = bpy.types.SequenceEditor
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'SEQUENCE'
bl_reload_parent = False
def _construct(self, data):
# Get the scene
scene_id = data.get('name')
scene = bpy.data.scenes.get(scene_id, None)
# Create sequencer data
scene.sequence_editor_clear()
scene.sequence_editor_create()
return scene.sequence_editor
def resolve(self):
scene = bpy.data.scenes.get(self.data['name'], None)
if scene:
if scene.sequence_editor is None:
self.instance = self._construct(self.data)
else:
self.instance = scene.sequence_editor
else:
logging.warning("Sequencer editor scene not found")
def _load_implementation(self, data, target):
loader = Loader()
# Sequencer
sequences = data.get('sequences')
if sequences:
for seq in target.sequences_all:
if seq.name not in sequences:
target.sequences.remove(seq)
for seq_name, seq_data in sequences.items():
load_sequence(seq_data, target)
def _dump_implementation(self, data, instance=None):
assert(instance)
sequence_dumper = Dumper()
sequence_dumper.depth = 1
sequence_dumper.include_filter = [
'proxy_storage',
]
data = {}#sequence_dumper.dump(instance)
# Sequencer
sequences = {}
for seq in instance.sequences_all:
sequences[seq.name] = dump_sequence(seq)
data['sequences'] = sequences
data['name'] = instance.id_data.name
return data
def _resolve_deps_implementation(self):
deps = []
for seq in self.instance.sequences_all:
if seq.type == 'MOVIE' and seq.filepath:
deps.append(Path(bpy.path.abspath(seq.filepath)))
elif seq.type == 'SOUND' and seq.sound:
deps.append(seq.sound)
elif seq.type == 'IMAGE':
for e in seq.elements:
deps.append(Path(bpy.path.abspath(seq.directory), e.filename))
return deps

View File

@ -30,9 +30,6 @@ from .dump_anything import Dumper, Loader
class BlSound(BlDatablock):
bl_id = "sounds"
bl_class = bpy.types.Sound
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SOUND'
bl_reload_parent = False

View File

@ -26,9 +26,6 @@ from .bl_datablock import BlDatablock
class BlSpeaker(BlDatablock):
bl_id = "speakers"
bl_class = bpy.types.Speaker
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'SPEAKER'
bl_reload_parent = False

View File

@ -26,9 +26,6 @@ from .bl_datablock import BlDatablock
class BlTexture(BlDatablock):
bl_id = "textures"
bl_class = bpy.types.Texture
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'TEXTURE'
bl_reload_parent = False

View File

@ -22,14 +22,11 @@ from pathlib import Path
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock, get_datablock_from_uuid
from .bl_material import dump_materials_slots, load_materials_slots
class BlVolume(BlDatablock):
bl_id = "volumes"
bl_class = bpy.types.Volume
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = False
bl_icon = 'VOLUME_DATA'
bl_reload_parent = False
@ -40,19 +37,9 @@ class BlVolume(BlDatablock):
loader.load(target.display, data['display'])
# MATERIAL SLOTS
target.materials.clear()
for mat_uuid, mat_name in data["material_list"]:
mat_ref = None
if mat_uuid is not None:
mat_ref = get_datablock_from_uuid(mat_uuid, None)
else:
mat_ref = bpy.data.materials.get(mat_name, None)
if mat_ref is None:
raise Exception("Material doesn't exist")
target.materials.append(mat_ref)
src_materials = data.get('materials', None)
if src_materials:
load_materials_slots(src_materials, target.materials)
def _construct(self, data):
return bpy.data.volumes.new(data["name"])
@ -78,7 +65,7 @@ class BlVolume(BlDatablock):
data['display'] = dumper.dump(instance.display)
# Fix material index
data['material_list'] = [(m.uuid, m.name) for m in instance.materials if m]
data['materials'] = dump_materials_slots(instance.materials)
return data

View File

@ -21,17 +21,14 @@ import mathutils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from .bl_material import (load_shader_node_tree,
dump_shader_node_tree,
get_node_tree_dependencies)
from .bl_material import (load_node_tree,
dump_node_tree,
get_node_tree_dependencies)
class BlWorld(BlDatablock):
bl_id = "worlds"
bl_class = bpy.types.World
bl_delay_refresh = 1
bl_delay_apply = 1
bl_automatic_push = True
bl_check_common = True
bl_icon = 'WORLD_DATA'
bl_reload_parent = False
@ -47,7 +44,7 @@ class BlWorld(BlDatablock):
if target.node_tree is None:
target.use_nodes = True
load_shader_node_tree(data['node_tree'], target.node_tree)
load_node_tree(data['node_tree'], target.node_tree)
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -61,7 +58,7 @@ class BlWorld(BlDatablock):
]
data = world_dumper.dump(instance)
if instance.use_nodes:
data['node_tree'] = dump_shader_node_tree(instance.node_tree)
data['node_tree'] = dump_node_tree(instance.node_tree)
return data

View File

@ -465,6 +465,7 @@ class Loader:
self.type_subset = self.match_subset_all
self.occlude_read_only = False
self.order = ['*']
self.exclude_filter = []
def load(self, dst_data, src_dumped_data):
self._load_any(
@ -475,7 +476,8 @@ class Loader:
def _load_any(self, any, dump):
for filter_function, load_function in self.type_subset:
if filter_function(any):
if filter_function(any) and \
any.sub_element_name not in self.exclude_filter:
load_function(any, dump)
return
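# With the new exclude filter a load can skip fields that must not be
# overwritten, e.g. (sketch, mirroring load_sequence above):
#
#   loader = Loader()
#   loader.exclude_filter = ['filepath', 'sound', 'filenames', 'fps']
#   loader.load(sequence, dumped_strip)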
@ -514,7 +516,7 @@ class Loader:
T.ColorRampElement: DESTRUCTOR_REMOVE,
T.Modifier: DESTRUCTOR_CLEAR,
T.GpencilModifier: DESTRUCTOR_CLEAR,
T.Constraint: CONSTRUCTOR_NEW,
T.Constraint: DESTRUCTOR_REMOVE,
}
element_type = element.bl_rna_property.fixed_type
@ -529,7 +531,13 @@ class Loader:
if destructor:
if destructor == DESTRUCTOR_REMOVE:
collection = element.read()
for i in range(len(collection)-1):
elems_to_remove = len(collection)
# Color ramp doesn't allow to remove all elements
if type(element_type) == T.ColorRampElement:
elems_to_remove -= 1
for i in range(elems_to_remove):
collection.remove(collection[0])
else:
getattr(element.read(), DESTRUCTOR_CLEAR)()
@ -588,6 +596,8 @@ class Loader:
instance.write(bpy.data.textures.get(dump))
elif isinstance(rna_property_type, T.ColorRamp):
self._load_default(instance, dump)
elif isinstance(rna_property_type, T.NodeTree):
instance.write(bpy.data.node_groups.get(dump))
elif isinstance(rna_property_type, T.Object):
instance.write(bpy.data.objects.get(dump))
elif isinstance(rna_property_type, T.Mesh):
@ -600,6 +610,8 @@ class Loader:
instance.write(bpy.data.fonts.get(dump))
elif isinstance(rna_property_type, T.Sound):
instance.write(bpy.data.sounds.get(dump))
# elif isinstance(rna_property_type, T.ParticleSettings):
#     instance.write(bpy.data.particles.get(dump))
def _load_matrix(self, matrix, dump):
matrix.write(mathutils.Matrix(dump))

View File

@ -17,6 +17,8 @@
import asyncio
import copy
import gzip
import logging
import os
import queue
@ -25,27 +27,35 @@ import shutil
import string
import sys
import time
from datetime import datetime
from operator import itemgetter
from pathlib import Path
from queue import Queue
from time import gmtime, strftime
try:
import _pickle as pickle
except ImportError:
import pickle
import bpy
import mathutils
from bpy.app.handlers import persistent
from bpy_extras.io_utils import ExportHelper, ImportHelper
from replication.constants import (FETCHED, RP_COMMON, STATE_ACTIVE,
STATE_INITIAL, STATE_SYNCING, UP)
from replication.constants import (COMMITED, FETCHED, RP_COMMON, STATE_ACTIVE,
STATE_INITIAL, STATE_SYNCING, UP)
from replication.data import ReplicatedDataFactory
from replication.exception import NonAuthorizedOperationError
from replication.exception import NonAuthorizedOperationError, ContextError
from replication.interface import session
from . import bl_types, delayable, environment, ui, utils
from . import bl_types, environment, timers, ui, utils
from .presence import SessionStatusWidget, renderer, view3d_find
from .timers import registry
background_execution_queue = Queue()
deleyables = []
stop_modal_executor = False
def session_callback(name):
""" Session callback wrapper
@ -64,30 +74,42 @@ def session_callback(name):
def initialize_session():
"""Session connection init handler
"""
logging.info("Initializing the scene")
settings = utils.get_preferences()
runtime_settings = bpy.context.window_manager.session
# Step 1: Construct nodes
logging.info("Constructing nodes")
for node in session._graph.list_ordered():
node_ref = session.get(node)
node_ref = session.get(uuid=node)
if node_ref.state == FETCHED:
if node_ref is None:
logging.error(f"Can't construct node {node}")
elif node_ref.state == FETCHED:
node_ref.resolve()
# Step 2: Load nodes
logging.info("Loading nodes")
for node in session._graph.list_ordered():
node_ref = session.get(node)
node_ref = session.get(uuid=node)
if node_ref.state == FETCHED:
if node_ref is None:
logging.error(f"Can't load node {node}")
elif node_ref.state == FETCHED:
node_ref.apply()
logging.info("Registering timers")
# Step 4: Register blender timers
for d in deleyables:
d.register()
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
bpy.ops.session.apply_armature_operator('INVOKE_DEFAULT')
# Step 5: Clearing history
utils.flush_history()
# Step 6: Launch deps graph update handling
bpy.app.handlers.depsgraph_update_post.append(depsgraph_evaluation)
@session_callback('on_exit')
def on_connection_end(reason="none"):
@ -106,9 +128,8 @@ def on_connection_end(reason="none"):
stop_modal_executor = True
if settings.update_method == 'DEPSGRAPH':
bpy.app.handlers.depsgraph_update_post.remove(
depsgraph_evaluation)
if depsgraph_evaluation in bpy.app.handlers.depsgraph_update_post:
bpy.app.handlers.depsgraph_update_post.remove(depsgraph_evaluation)
# Step 3: remove file handler
logger = logging.getLogger()
@ -138,7 +159,7 @@ class SessionStartOperator(bpy.types.Operator):
runtime_settings = context.window_manager.session
users = bpy.data.window_managers['WinMan'].online_users
admin_pass = runtime_settings.password
use_extern_update = settings.update_method == 'DEPSGRAPH'
users.clear()
deleyables.clear()
@ -149,9 +170,10 @@ class SessionStartOperator(bpy.types.Operator):
datefmt='%H:%M:%S'
)
start_time = datetime.now().strftime('%Y_%m_%d_%H-%M-%S')
log_directory = os.path.join(
settings.cache_directory,
"multiuser_client.log")
f"multiuser_{start_time}.log")
os.makedirs(settings.cache_directory, exist_ok=True)
@ -186,17 +208,8 @@ class SessionStartOperator(bpy.types.Operator):
bpy_factory.register_type(
type_module_class.bl_class,
type_module_class,
timer=type_local_config.bl_delay_refresh*1000,
automatic=type_local_config.auto_push,
check_common=type_module_class.bl_check_common)
if settings.update_method == 'DEFAULT':
if type_local_config.bl_delay_apply > 0:
deleyables.append(
delayable.ApplyTimer(
timout=type_local_config.bl_delay_apply,
target_type=type_module_class))
if bpy.app.version[1] >= 91:
python_binary_path = sys.executable
else:
@ -205,11 +218,7 @@ class SessionStartOperator(bpy.types.Operator):
session.configure(
factory=bpy_factory,
python_path=python_binary_path,
external_update_handling=use_extern_update)
external_update_handling=True)
if settings.update_method == 'DEPSGRAPH':
deleyables.append(delayable.ApplyTimer(
settings.depsgraph_update_rate/1000))
# Host a session
if self.host:
@ -259,12 +268,16 @@ class SessionStartOperator(bpy.types.Operator):
logging.error(str(e))
# Background client updates service
deleyables.append(delayable.ClientUpdate())
deleyables.append(delayable.DynamicRightSelectTimer())
session_update = delayable.SessionStatusUpdate()
session_user_sync = delayable.SessionUserSync()
session_background_executor = delayable.MainThreadExecutor(
deleyables.append(timers.ClientUpdate())
deleyables.append(timers.DynamicRightSelectTimer())
deleyables.append(timers.ApplyTimer(timeout=settings.depsgraph_update_rate))
# deleyables.append(timers.PushTimer(
#     queue=stagging,
#     timeout=settings.depsgraph_update_rate
# ))
session_update = timers.SessionStatusUpdate()
session_user_sync = timers.SessionUserSync()
session_background_executor = timers.MainThreadExecutor(
execution_queue=background_execution_queue)
session_update.register()
@ -586,9 +599,14 @@ class SessionApply(bpy.types.Operator):
def execute(self, context):
logging.debug(f"Running apply on {self.target}")
try:
node_ref = session.get(uuid=self.target)
session.apply(self.target,
force=True,
force_dependencies=self.reset_dependencies)
if node_ref.bl_reload_parent:
for parent in session._graph.find_parents(self.target):
logging.debug(f"Refresh parent {parent}")
session.apply(parent, force=True)
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"CANCELLED"}
@ -688,6 +706,31 @@ class SessionClearCache(bpy.types.Operator):
row = self.layout
row.label(text=f" Do you really want to remove local cache ? ")
class SessionPurgeOperator(bpy.types.Operator):
"Remove node with lost references"
bl_idname = "session.purge"
bl_label = "Purge session data"
@classmethod
def poll(cls, context):
return True
def execute(self, context):
try:
sanitize_deps_graph(remove_nodes=True)
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"FINISHED"}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def draw(self, context):
row = self.layout
row.label(text=f" Do you really want to remove local cache ? ")
class SessionNotifyOperator(bpy.types.Operator):
"""Dialog only operator"""
bl_idname = "session.notify"
@ -712,6 +755,150 @@ class SessionNotifyOperator(bpy.types.Operator):
return context.window_manager.invoke_props_dialog(self)
class SessionSaveBackupOperator(bpy.types.Operator, ExportHelper):
bl_idname = "session.save"
bl_label = "Save session data"
bl_description = "Save a snapshot of the collaborative session"
# ExportHelper mixin class uses this
filename_ext = ".db"
filter_glob: bpy.props.StringProperty(
default="*.db",
options={'HIDDEN'},
maxlen=255, # Max internal buffer length, longer would be clamped.
)
enable_autosave: bpy.props.BoolProperty(
name="Auto-save",
description="Enable session auto-save",
default=True,
)
save_interval: bpy.props.FloatProperty(
name="Auto save interval",
description="auto-save interval (seconds)",
default=10,
)
def execute(self, context):
if self.enable_autosave:
recorder = timers.SessionBackupTimer(
filepath=self.filepath,
timeout=self.save_interval)
recorder.register()
deleyables.append(recorder)
else:
session.save(self.filepath)
return {'FINISHED'}
@classmethod
def poll(cls, context):
return session.state['STATE'] == STATE_ACTIVE
class SessionStopAutoSaveOperator(bpy.types.Operator):
bl_idname = "session.cancel_autosave"
bl_label = "Cancel auto-save"
bl_description = "Cancel session auto-save"
@classmethod
def poll(cls, context):
return (session.state['STATE'] == STATE_ACTIVE and 'SessionBackupTimer' in registry)
def execute(self, context):
autosave_timer = registry.get('SessionBackupTimer')
autosave_timer.unregister()
return {'FINISHED'}
class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
bl_idname = "session.load"
bl_label = "Load session save"
bl_description = "Load a Multi-user session save"
bl_options = {'REGISTER', 'UNDO'}
# ImportHelper mixin class uses this
filename_ext = ".db"
filter_glob: bpy.props.StringProperty(
default="*.db",
options={'HIDDEN'},
maxlen=255, # Max internal buffer length, longer would be clamped.
)
def execute(self, context):
from replication.graph import ReplicationGraph
# TODO: add filechecks
try:
f = gzip.open(self.filepath, "rb")
db = pickle.load(f)
except OSError as e:
f = open(self.filepath, "rb")
db = pickle.load(f)
if db:
logging.info(f"Reading {self.filepath}")
nodes = db.get("nodes")
logging.info(f"{len(nodes)} Nodes to load")
# init the factory with supported types
bpy_factory = ReplicatedDataFactory()
for type in bl_types.types_to_register():
type_module = getattr(bl_types, type)
name = [e.capitalize() for e in type.split('_')[1:]]
type_impl_name = 'Bl'+''.join(name)
type_module_class = getattr(type_module, type_impl_name)
bpy_factory.register_type(
type_module_class.bl_class,
type_module_class)
graph = ReplicationGraph()
for node, node_data in nodes:
node_type = node_data.get('str_type')
impl = bpy_factory.get_implementation_from_net(node_type)
if impl:
logging.info(f"Loading {node}")
instance = impl(owner=node_data['owner'],
uuid=node,
dependencies=node_data['dependencies'],
data=node_data['data'])
instance.store(graph)
instance.state = FETCHED
logging.info("Graph succefully loaded")
utils.clean_scene()
# Step 1: Construct nodes
for node in graph.list_ordered():
graph[node].resolve()
# Step 2: Load nodes
for node in graph.list_ordered():
graph[node].apply()
return {'FINISHED'}
@classmethod
def poll(cls, context):
return True
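# Assumed snapshot layout, inferred from the loading code above: a pickled
# (optionally gzip-compressed) dict whose "nodes" entry is an iterable of
# (uuid, metadata) pairs:
#
#   {"nodes": [(uuid, {'owner': ..., 'str_type': 'BlScene',
#                      'dependencies': [...], 'data': {...}}), ...]}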
def menu_func_import(self, context):
self.layout.operator(SessionLoadSaveOperator.bl_idname, text='Multi-user session snapshot (.db)')
classes = (
SessionStartOperator,
SessionStopOperator,
@ -726,21 +913,51 @@ classes = (
SessionInitOperator,
SessionClearCache,
SessionNotifyOperator,
SessionSaveBackupOperator,
SessionLoadSaveOperator,
SessionStopAutoSaveOperator,
SessionPurgeOperator,
)
def update_external_dependencies():
nodes_ids = session.list(filter=bl_types.bl_file.BlFile)
for node_id in nodes_ids:
node = session.get(node_id)
if node and node.owner in [session.id, RP_COMMON] \
and node.has_changed():
session.commit(node_id)
session.push(node_id, check_data=False)
def sanitize_deps_graph(remove_nodes: bool = False):
""" Cleanup the replication graph
"""
if session and session.state['STATE'] == STATE_ACTIVE:
start = utils.current_milli_time()
rm_cpt = 0
for node_key in session.list():
node = session.get(node_key)
if node is None \
or (node.state == UP and not node.resolve(construct=False)):
if remove_nodes:
try:
session.remove(node.uuid, remove_dependencies=False)
logging.info(f"Removing {node.uuid}")
rm_cpt += 1
except NonAuthorizedOperationError:
continue
logging.info(f"Sanitize took { utils.current_milli_time()-start} ms")
@persistent
def sanitize_deps_graph(dummy):
"""sanitize deps graph
Temporary solution to resolve each node pointers after a Undo.
A future solution should be to avoid storing datablock references...
"""
if session and session.state['STATE'] == STATE_ACTIVE:
for node_key in session.list():
session.get(node_key).resolve()
@persistent
def resolve_deps_graph(dummy):
"""Resolve deps graph
Temporary solution to resolve each node pointers after a Undo.
A future solution should be to avoid storing datablock references...
"""
if session and session.state['STATE'] == STATE_ACTIVE:
sanitize_deps_graph(remove_nodes=True)
@persistent
def load_pre_handler(dummy):
@ -764,41 +981,53 @@ def depsgraph_evaluation(scene):
dependency_updates = [u for u in blender_depsgraph.updates]
settings = utils.get_preferences()
update_external_dependencies()
# NOTE: maybe we don't need to check each update but only the first
for update in reversed(dependency_updates):
# Is the object tracked ?
if update.id.uuid:
# Retrieve local version
node = session.get(update.id.uuid)
node = session.get(uuid=update.id.uuid)
# Check our right on this update:
# - if it's ours or (under common and diff), launch the
# update process
# - if it's someone else's, ignore the update
if node and node.owner in [session.id, RP_COMMON] and node.state == UP:
# Avoid slow geometry update
if 'EDIT' in context.mode and \
not settings.sync_flags.sync_during_editmode:
break
session.stash(node.uuid)
if node and (node.owner == session.id or node.bl_check_common):
if node.state == UP:
try:
if node.has_changed():
session.commit(node.uuid)
session.push(node.uuid, check_data=False)
except ReferenceError:
logging.debug(f"Reference error {node.uuid}")
if not node.is_valid():
session.remove(node.uuid)
except ContextError as e:
logging.debug(e)
except Exception as e:
logging.error(e)
else:
# Distant update
continue
# else:
# # New items !
# logger.error("UPDATE: ADD")
# A new scene is created
elif isinstance(update.id, bpy.types.Scene):
ref = session.get(reference=update.id)
if ref:
ref.resolve()
else:
scn_uuid = session.add(update.id)
session.commit(scn_uuid)
session.push(scn_uuid, check_data=False)
def register():
from bpy.utils import register_class
for cls in classes:
register_class(cls)
bpy.app.handlers.undo_post.append(sanitize_deps_graph)
bpy.app.handlers.redo_post.append(sanitize_deps_graph)
bpy.app.handlers.undo_post.append(resolve_deps_graph)
bpy.app.handlers.redo_post.append(resolve_deps_graph)
bpy.app.handlers.load_pre.append(load_pre_handler)
bpy.app.handlers.frame_change_pre.append(update_client_frame)
@ -812,8 +1041,8 @@ def unregister():
for cls in reversed(classes):
unregister_class(cls)
bpy.app.handlers.undo_post.remove(sanitize_deps_graph)
bpy.app.handlers.redo_post.remove(sanitize_deps_graph)
bpy.app.handlers.undo_post.remove(resolve_deps_graph)
bpy.app.handlers.redo_post.remove(resolve_deps_graph)
bpy.app.handlers.load_pre.remove(load_pre_handler)
bpy.app.handlers.frame_change_pre.remove(update_client_frame)

View File

@ -97,8 +97,6 @@ def get_log_level(self):
class ReplicatedDatablock(bpy.types.PropertyGroup):
type_name: bpy.props.StringProperty()
bl_name: bpy.props.StringProperty()
bl_delay_refresh: bpy.props.FloatProperty()
bl_delay_apply: bpy.props.FloatProperty()
use_as_filter: bpy.props.BoolProperty(default=True)
auto_push: bpy.props.BoolProperty(default=True)
icon: bpy.props.StringProperty()
@ -197,22 +195,13 @@ class SessionPrefs(bpy.types.AddonPreferences):
connection_timeout: bpy.props.IntProperty(
name='connection timeout',
description='connection timeout before disconnection',
default=1000
default=5000
)
update_method: bpy.props.EnumProperty(
name='update method',
description='replication update method',
items=[
('DEFAULT', "Default", "Default: Use threads to monitor databloc changes"),
('DEPSGRAPH', "Depsgraph",
"Experimental: Use the blender dependency graph to trigger updates"),
],
) )
# Replication update settings
depsgraph_update_rate: bpy.props.IntProperty(
name='depsgraph update rate',
description='Dependency graph update rate (milliseconds)',
default=1000
)
depsgraph_update_rate: bpy.props.FloatProperty(
name='depsgraph update rate (s)',
description='Dependency graph update rate (s)',
default=1
)
clear_memory_filecache: bpy.props.BoolProperty(
name="Clear memory filecache",
@ -282,11 +271,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
description="Rights", description="Rights",
default=False default=False
) )
conf_session_timing_expanded: bpy.props.BoolProperty(
name="timings",
description="timings",
default=False
)
conf_session_cache_expanded: bpy.props.BoolProperty(
name="Cache",
description="cache",
@ -390,28 +374,7 @@ class SessionPrefs(bpy.types.AddonPreferences):
row = box.row()
row.label(text="Init the session from:")
row.prop(self, "init_method", text="")
row = box.row()
row.label(text="Update method:")
row.prop(self, "update_method", text="")
table = box.box()
table.row().prop(
self, "conf_session_timing_expanded", text="Refresh rates",
icon=get_expanded_icon(self.conf_session_timing_expanded),
emboss=False)
if self.conf_session_timing_expanded:
line = table.row()
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
for item in self.supported_datablocks:
line = table.row(align=True)
line.label(text="", icon=item.icon)
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
# HOST SETTINGS
box = grid.box()
box.prop(
@ -467,11 +430,8 @@ class SessionPrefs(bpy.types.AddonPreferences):
type_module_class = getattr(type_module, type_impl_name)
new_db.name = type_impl_name
new_db.type_name = type_impl_name
new_db.bl_delay_refresh = type_module_class.bl_delay_refresh
new_db.bl_delay_apply = type_module_class.bl_delay_apply
new_db.use_as_filter = True new_db.use_as_filter = True
new_db.icon = type_module_class.bl_icon new_db.icon = type_module_class.bl_icon
new_db.auto_push = type_module_class.bl_automatic_push
new_db.bl_name = type_module_class.bl_id new_db.bl_name = type_module_class.bl_id
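With the per-datablock refresh/apply delays removed, `depsgraph_update_rate` is the one pacing preference left. A minimal sketch of how another module could read it, assuming the add-on package is registered as `multi_user` (the property name comes from the diff above; the lookup path is an assumption):

import bpy

def get_apply_interval() -> float:
    # Assumption: the preferences entry is keyed by the package name 'multi_user'.
    prefs = bpy.context.preferences.addons['multi_user'].preferences
    # Now a FloatProperty holding seconds (default=1), not milliseconds.
    return prefs.depsgraph_update_rate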

View File

@ -16,68 +16,48 @@
# ##### END GPL LICENSE BLOCK ##### # ##### END GPL LICENSE BLOCK #####
import logging import logging
import sys
import traceback
import bpy import bpy
from . import utils
from .presence import (renderer,
UserFrustumWidget,
UserNameWidget,
UserSelectionWidget,
refresh_3d_view,
generate_user_camera,
get_view_matrix,
refresh_sidebar_view)
from . import operators
from replication.constants import (FETCHED,
UP,
RP_COMMON,
STATE_INITIAL,
STATE_QUITTING,
STATE_ACTIVE,
STATE_SYNCING,
STATE_LOBBY,
STATE_SRV_SYNC)
from replication.interface import session
from replication.exception import NonAuthorizedOperationError
from replication.constants import (FETCHED, RP_COMMON, STATE_ACTIVE,
STATE_INITIAL, STATE_LOBBY, STATE_QUITTING,
STATE_SRV_SYNC, STATE_SYNCING, UP)
from replication.exception import NonAuthorizedOperationError, ContextError
from replication.interface import session
from . import operators, utils
from .presence import (UserFrustumWidget, UserNameWidget, UserSelectionWidget,
generate_user_camera, get_view_matrix, refresh_3d_view,
refresh_sidebar_view, renderer)
this = sys.modules[__name__]
# Registered timers
this.registry = dict()
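The `this = sys.modules[__name__]` line above gives the module a reference to itself, so `registry` is one shared dict no matter how the module is imported. A self-contained sketch of the pattern (the lookup helper is illustrative, not from this commit):

import sys

this = sys.modules[__name__]
this.registry = dict()

def get_registered_timer(timer_id):
    # Illustrative only: UI code in this commit does a plain membership
    # test instead, e.g. 'SessionBackupTimer' in registry.
    return this.registry.get(timer_id)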
def is_annotating(context: bpy.types.Context): def is_annotating(context: bpy.types.Context):
""" Check if the annotate mode is enabled """ Check if the annotate mode is enabled
""" """
return bpy.context.workspace.tools.from_space_view3d_mode('OBJECT', create=False).idname == 'builtin.annotate' return bpy.context.workspace.tools.from_space_view3d_mode('OBJECT', create=False).idname == 'builtin.annotate'
class Delayable():
"""Delayable task interface
"""
def register(self):
raise NotImplementedError
def execute(self):
raise NotImplementedError
def unregister(self):
raise NotImplementedError
class Timer(Delayable):
class Timer(object):
"""Timer binder interface for blender """Timer binder interface for blender
Run a bpy.app.Timer in the background looping at the given rate Run a bpy.app.Timer in the background looping at the given rate
""" """
def __init__(self, duration=1): def __init__(self, timeout=10, id=None):
super().__init__() self._timeout = timeout
self._timeout = duration
self.is_running = False self.is_running = False
self.id = id if id else self.__class__.__name__
def register(self): def register(self):
"""Register the timer into the blender timer system """Register the timer into the blender timer system
""" """
if not self.is_running: if not self.is_running:
this.registry[self.id] = self
bpy.app.timers.register(self.main) bpy.app.timers.register(self.main)
self.is_running = True self.is_running = True
logging.debug(f"Register {self.__class__.__name__}") logging.debug(f"Register {self.__class__.__name__}")
@ -105,21 +85,24 @@ class Timer(Delayable):
"""Unnegister the timer of the blender timer system """Unnegister the timer of the blender timer system
""" """
if bpy.app.timers.is_registered(self.main): if bpy.app.timers.is_registered(self.main):
logging.info(f"Unregistering {self.id}")
bpy.app.timers.unregister(self.main) bpy.app.timers.unregister(self.main)
del this.registry[self.id]
self.is_running = False self.is_running = False
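For context, the `bpy.app.timers` contract that `Timer.main` relies on: the registered callback runs after the returned delay and stops when it returns None. A minimal standalone example, independent of the add-on:

import bpy

def tick():
    print('tick')
    return 1.0  # run again in one second; returning None would stop the timer

bpy.app.timers.register(tick)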
class ApplyTimer(Timer):
def __init__(self, timout=1, target_type=None):
self._type = target_type
super().__init__(timout)
class SessionBackupTimer(Timer):
def __init__(self, timeout=10, filepath=None):
self._filepath = filepath
super().__init__(timeout)
def execute(self):
session.save(self._filepath)
class ApplyTimer(Timer):
def execute(self): def execute(self):
if session and session.state['STATE'] == STATE_ACTIVE: if session and session.state['STATE'] == STATE_ACTIVE:
if self._type:
nodes = session.list(filter=self._type)
else:
nodes = session.list() nodes = session.list()
for node in nodes: for node in nodes:
@ -129,19 +112,18 @@ class ApplyTimer(Timer):
try: try:
session.apply(node) session.apply(node)
except Exception as e: except Exception as e:
logging.error(f"Fail to apply {node_ref.uuid}: {e}") logging.error(f"Fail to apply {node_ref.uuid}")
traceback.print_exc()
else: else:
if self._type.bl_reload_parent: if node_ref.bl_reload_parent:
parents = [] for parent in session._graph.find_parents(node):
logging.debug("Refresh parent {node}")
session.apply(parent, force=True)
for n in session.list():
deps = session.get(uuid=n).dependencies
if deps and node in deps:
session.apply(n, force=True)
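The rewritten else branch trades a full repository scan for a direct graph lookup. As plain functions for comparison (a sketch: `session` and `find_parents` are used exactly as in the diff, nothing else is implied):

def refresh_parents_old(node):
    # Old behaviour: O(N) scan of every node's dependencies per applied node.
    for n in session.list():
        deps = session.get(uuid=n).dependencies
        if deps and node in deps:
            session.apply(n, force=True)

def refresh_parents_new(node):
    # New behaviour: ask the dependency graph for the parents directly.
    for parent in session._graph.find_parents(node):
        session.apply(parent, force=True)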
class DynamicRightSelectTimer(Timer): class DynamicRightSelectTimer(Timer):
def __init__(self, timout=.1): def __init__(self, timeout=.1):
super().__init__(timout) super().__init__(timeout)
self._last_selection = [] self._last_selection = []
self._user = None self._user = None
self._annotating = False self._annotating = False
@ -158,6 +140,9 @@ class DynamicRightSelectTimer(Timer):
ctx = bpy.context ctx = bpy.context
annotation_gp = ctx.scene.grease_pencil annotation_gp = ctx.scene.grease_pencil
if annotation_gp and not annotation_gp.uuid:
ctx.scene.update_tag()
# if an annotation exists and is tracked # if an annotation exists and is tracked
if annotation_gp and annotation_gp.uuid: if annotation_gp and annotation_gp.uuid:
registered_gp = session.get(uuid=annotation_gp.uuid) registered_gp = session.get(uuid=annotation_gp.uuid)
@ -172,6 +157,13 @@ class DynamicRightSelectTimer(Timer):
settings.username, settings.username,
ignore_warnings=True, ignore_warnings=True,
affect_dependencies=False) affect_dependencies=False)
if registered_gp.owner == settings.username:
gp_node = session.get(uuid=annotation_gp.uuid)
if gp_node.has_changed():
session.commit(gp_node.uuid)
session.push(gp_node.uuid, check_data=False)
elif self._annotating: elif self._annotating:
session.change_owner( session.change_owner(
registered_gp.uuid, registered_gp.uuid,
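The new owner branch commits and pushes the annotation layer as soon as it reports a change. Reduced to its essentials (a sketch reusing only the session calls visible in this diff):

gp_node = session.get(uuid=annotation_gp.uuid)
if gp_node.has_changed():
    session.commit(gp_node.uuid)                  # snapshot the local edits
    session.push(gp_node.uuid, check_data=False)  # send without re-diffing the data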
@ -262,8 +254,8 @@ class DynamicRightSelectTimer(Timer):
class ClientUpdate(Timer): class ClientUpdate(Timer):
def __init__(self, timout=.1): def __init__(self, timeout=.1):
super().__init__(timout) super().__init__(timeout)
self.handle_quit = False self.handle_quit = False
self.users_metadata = {} self.users_metadata = {}
@ -325,16 +317,16 @@ class ClientUpdate(Timer):
class SessionStatusUpdate(Timer): class SessionStatusUpdate(Timer):
def __init__(self, timout=1): def __init__(self, timeout=1):
super().__init__(timout) super().__init__(timeout)
def execute(self): def execute(self):
refresh_sidebar_view() refresh_sidebar_view()
class SessionUserSync(Timer): class SessionUserSync(Timer):
def __init__(self, timout=1): def __init__(self, timeout=1):
super().__init__(timout) super().__init__(timeout)
self.settings = utils.get_preferences() self.settings = utils.get_preferences()
def execute(self): def execute(self):
@ -367,8 +359,8 @@ class SessionUserSync(Timer):
class MainThreadExecutor(Timer): class MainThreadExecutor(Timer):
def __init__(self, timout=1, execution_queue=None): def __init__(self, timeout=1, execution_queue=None):
super().__init__(timout) super().__init__(timeout)
self.execution_queue = execution_queue self.execution_queue = execution_queue
def execute(self): def execute(self):

View File

@ -29,6 +29,7 @@ from replication.constants import (ADDED, ERROR, FETCHED,
STATE_LAUNCHING_SERVICES) STATE_LAUNCHING_SERVICES)
from replication import __version__ from replication import __version__
from replication.interface import session from replication.interface import session
from .timers import registry
ICONS_PROP_STATES = ['TRIA_DOWN', # ADDED ICONS_PROP_STATES = ['TRIA_DOWN', # ADDED
'TRIA_UP', # COMMITED 'TRIA_UP', # COMMITED
@ -268,7 +269,6 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
if settings.sidebar_advanced_rep_expanded: if settings.sidebar_advanced_rep_expanded:
replication_section_row = replication_section.row() replication_section_row = replication_section.row()
replication_section_row.label(text="Sync flags", icon='COLLECTION_NEW')
replication_section_row = replication_section.row() replication_section_row = replication_section.row()
replication_section_row.prop(settings.sync_flags, "sync_render_settings") replication_section_row.prop(settings.sync_flags, "sync_render_settings")
replication_section_row = replication_section.row() replication_section_row = replication_section.row()
@ -281,34 +281,8 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
warning = replication_section_row.box() warning = replication_section_row.box()
warning.label(text="Don't use this with heavy meshes !", icon='ERROR') warning.label(text="Don't use this with heavy meshes !", icon='ERROR')
replication_section_row = replication_section.row() replication_section_row = replication_section.row()
replication_section_row.prop(settings, "depsgraph_update_rate", text="Apply delay")
replication_section_row.label(text="Update method", icon='RECOVER_LAST')
replication_section_row = replication_section.row()
replication_section_row.prop(settings, "update_method", expand=True)
replication_section_row = replication_section.row()
replication_timers = replication_section_row.box()
replication_timers.label(text="Replication timers", icon='TIME')
if settings.update_method == "DEFAULT":
replication_timers = replication_timers.row()
# Replication frequencies
flow = replication_timers.grid_flow(
row_major=True, columns=0, even_columns=True, even_rows=False, align=True)
line = flow.row(align=True)
line.label(text=" ")
line.separator()
line.label(text="refresh (sec)")
line.label(text="apply (sec)")
for item in settings.supported_datablocks:
line = flow.row(align=True)
line.prop(item, "auto_push", text="", icon=item.icon)
line.separator()
line.prop(item, "bl_delay_refresh", text="")
line.prop(item, "bl_delay_apply", text="")
else:
replication_timers = replication_timers.row()
replication_timers.label(text="Update rate (ms):")
replication_timers.prop(settings, "depsgraph_update_rate", text="")
cache_section = layout.row().box() cache_section = layout.row().box()
cache_section.prop( cache_section.prop(
@ -563,6 +537,13 @@ class SESSION_PT_repository(bpy.types.Panel):
row = layout.row() row = layout.row()
if session.state['STATE'] == STATE_ACTIVE: if session.state['STATE'] == STATE_ACTIVE:
if 'SessionBackupTimer' in registry:
row.alert = True
row.operator('session.cancel_autosave', icon="CANCEL")
row.alert = False
else:
row.operator('session.save', icon="FILE_TICK")
flow = layout.grid_flow( flow = layout.grid_flow(
row_major=True, row_major=True,
columns=0, columns=0,
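The panel offers `session.cancel_autosave` whenever a `SessionBackupTimer` is registered. A hypothetical sketch of that operator (only its bl_idname appears in this diff; the body below is an assumption):

import bpy
from .timers import registry

class SESSION_OT_cancel_autosave(bpy.types.Operator):
    bl_idname = 'session.cancel_autosave'
    bl_label = 'Cancel autosave'

    def execute(self, context):
        # Assumption: the backup timer registered itself under its class name.
        timer = registry.get('SessionBackupTimer')
        if timer:
            timer.unregister()
        return {'FINISHED'}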

View File

@ -65,6 +65,15 @@ def get_datablock_users(datablock):
return users return users
def flush_history():
try:
logging.debug("Flushing history")
for i in range(bpy.context.preferences.edit.undo_steps+1):
bpy.ops.ed.undo_push(message="Multiuser history flush")
except RuntimeError:
logging.error("Fail to overwrite history")
def get_state_str(state): def get_state_str(state):
state_str = 'UNKNOWN' state_str = 'UNKNOWN'
if state == STATE_WAITING: if state == STATE_WAITING:

View File

@ -13,7 +13,7 @@ def main():
if len(sys.argv) > 2: if len(sys.argv) > 2:
blender_rev = sys.argv[2] blender_rev = sys.argv[2]
else: else:
blender_rev = "2.91.0" blender_rev = "2.92.0"
try: try:
exit_val = BAT.test_blender_addon(addon_path=addon, blender_revision=blender_rev) exit_val = BAT.test_blender_addon(addon_path=addon, blender_revision=blender_rev)

View File

@ -7,7 +7,7 @@ import bpy
import random import random
from multi_user.bl_types.bl_object import BlObject from multi_user.bl_types.bl_object import BlObject
# Removed 'BUILD' modifier because the seed doesn't seem to be # Removed 'BUILD', 'SOFT_BODY' modifiers because the seed doesn't seem to be
# correctly initialized (#TODO: report the bug) # correctly initialized (#TODO: report the bug)
MOFIFIERS_TYPES = [ MOFIFIERS_TYPES = [
'DATA_TRANSFER', 'MESH_CACHE', 'MESH_SEQUENCE_CACHE', 'DATA_TRANSFER', 'MESH_CACHE', 'MESH_SEQUENCE_CACHE',
@ -22,8 +22,7 @@ MOFIFIERS_TYPES = [
'MESH_DEFORM', 'SHRINKWRAP', 'SIMPLE_DEFORM', 'SMOOTH', 'MESH_DEFORM', 'SHRINKWRAP', 'SIMPLE_DEFORM', 'SMOOTH',
'CORRECTIVE_SMOOTH', 'LAPLACIANSMOOTH', 'SURFACE_DEFORM', 'CORRECTIVE_SMOOTH', 'LAPLACIANSMOOTH', 'SURFACE_DEFORM',
'WARP', 'WAVE', 'CLOTH', 'COLLISION', 'DYNAMIC_PAINT', 'WARP', 'WAVE', 'CLOTH', 'COLLISION', 'DYNAMIC_PAINT',
'EXPLODE', 'FLUID', 'OCEAN', 'PARTICLE_INSTANCE', 'EXPLODE', 'FLUID', 'OCEAN', 'PARTICLE_INSTANCE', 'SURFACE']
'SOFT_BODY', 'SURFACE']
GP_MODIFIERS_TYPE = [ GP_MODIFIERS_TYPE = [
'GP_ARRAY', 'GP_BUILD', 'GP_MIRROR', 'GP_MULTIPLY', 'GP_ARRAY', 'GP_BUILD', 'GP_MIRROR', 'GP_MULTIPLY',
@ -72,5 +71,5 @@ def test_object(clear_blend):
test = implementation._construct(expected) test = implementation._construct(expected)
implementation._load(expected, test) implementation._load(expected, test)
result = implementation._dump(test) result = implementation._dump(test)
print(DeepDiff(expected, result))
assert not DeepDiff(expected, result) assert not DeepDiff(expected, result)
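The assertion holds because DeepDiff returns an empty, falsy result only when both dumps match. A tiny standalone illustration with generic data (not add-on types):

from deepdiff import DeepDiff

a = {'name': 'Cube', 'scale': [1, 1, 1]}
b = {'name': 'Cube', 'scale': [1, 2, 1]}

assert not DeepDiff(a, a)  # identical structures: empty diff, test passes
assert DeepDiff(a, b)      # any mismatch: truthy report, test fails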