Merge branch '49-connection-preset-system' of https://gitlab.com/slumber/multi-user into 49-connection-preset-system

Fabian 2021-06-08 15:00:50 +02:00
commit 3e41b18af1
32 changed files with 929 additions and 381 deletions


@ -2,9 +2,12 @@ stages:
- test
- build
- deploy
- doc
include:
- local: .gitlab/ci/test.gitlab-ci.yml
- local: .gitlab/ci/build.gitlab-ci.yml
- local: .gitlab/ci/deploy.gitlab-ci.yml
- local: .gitlab/ci/doc.gitlab-ci.yml


@ -1,5 +1,6 @@
build:
stage: build
needs: ["test"]
image: debian:stable-slim
script:
- rm -rf tests .git .gitignore script


@ -1,5 +1,6 @@
deploy:
stage: deploy
needs: ["build"]
image: slumber/docker-python
variables:
DOCKER_DRIVER: overlay2


@ -0,0 +1,16 @@
pages:
stage: doc
needs: ["deploy"]
image: python
script:
- pip install -U sphinx sphinx_rtd_theme sphinx-material
- sphinx-build -b html ./docs public
artifacts:
paths:
- public
only:
refs:
- master
- develop
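The same two commands can be run locally to preview the generated site before pushing (a sketch, assuming Sphinx is installed into the current Python environment and the commands are run from the repository root):

pip install -U sphinx sphinx_rtd_theme sphinx-material
sphinx-build -b html ./docs public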


@ -157,4 +157,33 @@ All notable changes to this project will be documented in this file.
- Empty and Light object selection highlights
- Material renaming
- Default material nodes input parameters
- blender 2.91 python api compatibility
- blender 2.91 python api compatibility
## [0.3.0] - 2021-04-14
### Added
- Curve material support
- Cycle visibility settings
- Session save/load operator
- Add new scene support
- Physics initial support
- Geometry node initial support
- Blender 2.93 compatibility
### Changed
- Host documentation on GitLab Pages
- Event-driven updates (from the Blender dependency graph)
### Fixed
- Vertex group assignment
- Parent relation can't be removed
- Separate object
- Delete animation
- Sync missing holdout option for grease pencil material
- Sync missing `skin_vertices`
- Exception access violation during Undo/Redo
- Sync missing armature bone Roll
- Sync missing driver data_path
- Constraint replication


@ -19,44 +19,46 @@ This tool aims to allow multiple users to work on the same scene over the network
## Usage
See the [documentation](https://multi-user.readthedocs.io/en/latest/) for details.
See the [documentation](https://slumber.gitlab.io/multi-user/index.html) for details.
## Troubleshooting
See the [troubleshooting guide](https://multi-user.readthedocs.io/en/latest/getting_started/troubleshooting.html) for tips on the most common issues.
See the [troubleshooting guide](https://slumber.gitlab.io/multi-user/getting_started/troubleshooting.html) for tips on the most common issues.
## Current development status
Currently, not all data-blocks are supported for replication over the wire. The following list summarizes the status of each one.
| Name | Status | Comment |
| ----------- | :----: | :--------------------------------------------------------------------------: |
| action | ✔️ | |
| armature | ❗ | Not stable |
| camera | ✔️ | |
| collection | ✔️ | |
| curve | ❗ | Nurbs not supported |
| gpencil | ✔️ | [Airbrush not supported](https://gitlab.com/slumber/multi-user/-/issues/123) |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| node_groups | ❗ | Material only |
| metaball | ✔️ | |
| object | ✔️ | |
| textures | ❗ | Supported for modifiers only |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| lightprobes | ✔️ | |
| compositing | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |
| texts | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
| nla | ❌ | |
| volumes | ✔️ | |
| particles | ❌ | [On-going](https://gitlab.com/slumber/multi-user/-/issues/24) |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❗ | Mask and Clip not supported yet |
| physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| libraries | ❗ | Partial |
| Name | Status | Comment |
| -------------- | :----: | :----------------------------------------------------------: |
| action | ✔️ | |
| armature | ❗ | Not stable |
| camera | ✔️ | |
| collection | ✔️ | |
| curve | ❗ | Nurbs surfaces not supported |
| gpencil | ✔️ | |
| image | ✔️ | |
| mesh | ✔️ | |
| material | ✔️ | |
| node_groups | ❗ | Material & Geometry only |
| geometry nodes | ✔️ | |
| metaball | ✔️ | |
| object | ✔️ | |
| textures | ❗ | Supported for modifiers/materials/geo nodes only |
| texts | ✔️ | |
| scene | ✔️ | |
| world | ✔️ | |
| lightprobes | ✔️ | |
| compositing | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/46) |
| texts | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/81) |
| nla | ❌ | |
| volumes | ✔️ | |
| particles | ❗ | The cache isn't syncing. |
| speakers | ❗ | [Partial](https://gitlab.com/slumber/multi-user/-/issues/65) |
| vse | ❗ | Mask and Clip not supported yet |
| physics | ❌ | [Planned](https://gitlab.com/slumber/multi-user/-/issues/45) |
| libraries | ❗ | Partial |
### Performance issues
@ -74,7 +76,7 @@ I'm working on it.
## Contributing
See [contributing section](https://multi-user.readthedocs.io/en/latest/ways_to_contribute.html) of the documentation.
See [contributing section](https://slumber.gitlab.io/multi-user/ways_to_contribute.html) of the documentation.
Feel free to [join the discord server](https://discord.gg/aBPvGws) to chat, seek help and contribute.


@ -374,15 +374,6 @@ Network
Advanced network settings
**IPC Port** is the port used for Inter Process Communication. This port is used
by the multi-user subprocesses to communicate with each other. If different instances
of multi-user are using the same IPC port, this will create conflicts!
.. note::
You only need to modify this setting if you need to launch multiple clients from the same
computer (or if you try to host and join from the same computer). To resolve this, you simply need to enter a different
**IPC port** for each blender instance.
**Timeout (in milliseconds)** is the maximum ping allowed before auto-disconnecting.
You should only increase it if you have a bad connection.


@ -19,7 +19,7 @@
bl_info = {
"name": "Multi-User",
"author": "Swann Martinez",
"version": (0, 3, 0),
"version": (0, 4, 0),
"description": "Enable real-time collaborative workflow inside blender",
"blender": (2, 82, 0),
"location": "3D View > Sidebar > Multi-User tab",
@ -44,7 +44,7 @@ from . import environment
DEPENDENCIES = {
("replication", '0.1.26'),
("replication", '0.1.36'),
}


@ -122,13 +122,13 @@ class addon_updater_install_popup(bpy.types.Operator):
# if true, run clean install - ie remove all files before adding new
# equivalent to deleting the addon and reinstalling, except the
# updater folder/backup folder remains
clean_install = bpy.props.BoolProperty(
clean_install: bpy.props.BoolProperty(
name="Clean install",
description="If enabled, completely clear the addon's folder before installing new update, creating a fresh install",
default=False,
options={'HIDDEN'}
)
ignore_enum = bpy.props.EnumProperty(
ignore_enum: bpy.props.EnumProperty(
name="Process update",
description="Decide to install, ignore, or defer new addon update",
items=[
@ -264,7 +264,7 @@ class addon_updater_update_now(bpy.types.Operator):
# if true, run clean install - ie remove all files before adding new
# equivalent to deleting the addon and reinstalling, except the
# updater folder/backup folder remains
clean_install = bpy.props.BoolProperty(
clean_install: bpy.props.BoolProperty(
name="Clean install",
description="If enabled, completely clear the addon's folder before installing new update, creating a fresh install",
default=False,
@ -332,7 +332,7 @@ class addon_updater_update_target(bpy.types.Operator):
i+=1
return ret
target = bpy.props.EnumProperty(
target: bpy.props.EnumProperty(
name="Target version to install",
description="Select the version to install",
items=target_version
@ -341,7 +341,7 @@ class addon_updater_update_target(bpy.types.Operator):
# if true, run clean install - ie remove all files before adding new
# equivalent to deleting the addon and reinstalling, except the
# updater folder/backup folder remains
clean_install = bpy.props.BoolProperty(
clean_install: bpy.props.BoolProperty(
name="Clean install",
description="If enabled, completely clear the addon's folder before installing new update, creating a fresh install",
default=False,
@ -399,7 +399,7 @@ class addon_updater_install_manually(bpy.types.Operator):
bl_description = "Proceed to manually install update"
bl_options = {'REGISTER', 'INTERNAL'}
error = bpy.props.StringProperty(
error: bpy.props.StringProperty(
name="Error Occurred",
default="",
options={'HIDDEN'}
@ -461,7 +461,7 @@ class addon_updater_updated_successful(bpy.types.Operator):
bl_description = "Update installation response"
bl_options = {'REGISTER', 'INTERNAL', 'UNDO'}
error = bpy.props.StringProperty(
error: bpy.props.StringProperty(
name="Error Occurred",
default="",
options={'HIDDEN'}


@ -42,13 +42,14 @@ __all__ = [
# 'bl_sequencer',
'bl_node_group',
'bl_texture',
"bl_particle",
] # Order here defines execution order
if bpy.app.version[1] >= 91:
__all__.append('bl_volume')
from . import *
from replication.data import ReplicatedDataFactory
from replication.data import DataTranslationProtocol
def types_to_register():
return __all__


@ -25,7 +25,7 @@ from enum import Enum
from .. import utils
from .dump_anything import (
Dumper, Loader, np_dump_collection, np_load_collection, remove_items_from_dict)
from .bl_datablock import BlDatablock
from .bl_datablock import BlDatablock, has_action, has_driver, dump_driver, load_driver
KEYFRAME = [
@ -61,7 +61,6 @@ def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy: bool = True) -> dict:
points = fcurve.keyframe_points
fcurve_data['keyframes_count'] = len(fcurve.keyframe_points)
fcurve_data['keyframe_points'] = np_dump_collection(points, KEYFRAME)
else: # Legacy method
dumper = Dumper()
fcurve_data["keyframe_points"] = []
@ -71,6 +70,18 @@ def dump_fcurve(fcurve: bpy.types.FCurve, use_numpy: bool = True) -> dict:
dumper.dump(k)
)
if fcurve.modifiers:
dumper = Dumper()
dumper.exclude_filter = [
'is_valid',
'active'
]
dumped_modifiers = []
for modifier in fcurve.modifiers:
dumped_modifiers.append(dumper.dump(modifier))
fcurve_data['modifiers'] = dumped_modifiers
return fcurve_data
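Taken together with load_fcurve below, this round-trips a complete curve including its F-modifiers. A minimal usage sketch (not part of this commit; the action name and import path are assumptions):

import bpy
from multi_user.bl_types.bl_action import dump_fcurve, load_fcurve  # module path assumed

src_action = bpy.data.actions["CubeAction"]                  # hypothetical action
dumped = dump_fcurve(src_action.fcurves[0], use_numpy=True)  # keyframes + F-modifiers
dst_action = bpy.data.actions.new("CubeAction.copy")
dst_curve = dst_action.fcurves.new(data_path="location")     # target curve to fill
load_fcurve(dumped, dst_curve)                               # loads points first, then modifiers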
@ -83,7 +94,7 @@ def load_fcurve(fcurve_data, fcurve):
:type fcurve: bpy.types.FCurve
"""
use_numpy = fcurve_data.get('use_numpy')
loader = Loader()
keyframe_points = fcurve.keyframe_points
# Remove all keyframe points
@ -128,6 +139,64 @@ def load_fcurve(fcurve_data, fcurve):
fcurve.update()
dumped_fcurve_modifiers = fcurve_data.get('modifiers', None)
if dumped_fcurve_modifiers:
# clear modifiers
for fmod in fcurve.modifiers:
fcurve.modifiers.remove(fmod)
# Load each modifier in order
for modifier_data in dumped_fcurve_modifiers:
modifier = fcurve.modifiers.new(modifier_data['type'])
loader.load(modifier, modifier_data)
elif fcurve.modifiers:
for fmod in fcurve.modifiers:
fcurve.modifiers.remove(fmod)
def dump_animation_data(datablock):
animation_data = {}
if has_action(datablock):
animation_data['action'] = datablock.animation_data.action.name
if has_driver(datablock):
animation_data['drivers'] = []
for driver in datablock.animation_data.drivers:
animation_data['drivers'].append(dump_driver(driver))
return animation_data
def load_animation_data(animation_data, datablock):
# Load animation data
if animation_data:
if datablock.animation_data is None:
datablock.animation_data_create()
for d in datablock.animation_data.drivers:
datablock.animation_data.drivers.remove(d)
if 'drivers' in animation_data:
for driver in animation_data['drivers']:
load_driver(datablock, driver)
if 'action' in animation_data:
datablock.animation_data.action = bpy.data.actions[animation_data['action']]
elif datablock.animation_data.action:
datablock.animation_data.action = None
# Remove existing animation data if there is nothing more to load
elif hasattr(datablock, 'animation_data') and datablock.animation_data:
datablock.animation_data_clear()
def resolve_animation_dependencies(datablock):
if has_action(datablock):
return [datablock.animation_data.action]
else:
return []
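A minimal usage sketch for the new helpers above (not part of this commit; object names and the import path are assumptions):

import bpy
from multi_user.bl_types.bl_action import dump_animation_data, load_animation_data

src = bpy.data.objects["Animated"]      # hypothetical object with an action and/or drivers
dst = bpy.data.objects["Animated.001"]
load_animation_data(dump_animation_data(src), dst)  # copies the action reference and drivers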
class BlAction(BlDatablock):
bl_id = "actions"


@ -56,6 +56,11 @@ class BlCamera(BlDatablock):
target_img.image = bpy.data.images[img_id]
loader.load(target_img, img_data)
img_user = img_data.get('image_user')
if img_user:
loader.load(target_img.image_user, img_user)
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -101,10 +106,19 @@ class BlCamera(BlDatablock):
'scale',
'use_flip_x',
'use_flip_y',
'image'
'image_user',
'image',
'frame_duration',
'frame_start',
'frame_offset',
'use_cyclic',
'use_auto_refresh'
]
return dumper.dump(instance)
data = dumper.dump(instance)
for index, image in enumerate(instance.background_images):
if image.image_user:
data['background_images'][index]['image_user'] = dumper.dump(image.image_user)
return data
def _resolve_deps_implementation(self):
deps = []
for background in self.instance.background_images:


@ -72,10 +72,10 @@ def load_driver(target_datablock, src_driver):
for src_target in src_var_data['targets']:
src_target_data = src_var_data['targets'][src_target]
new_var.targets[src_target].id = utils.resolve_from_id(
src_target_data['id'], src_target_data['id_type'])
loader.load(
new_var.targets[src_target], src_target_data)
src_id = src_target_data.get('id')
if src_id:
new_var.targets[src_target].id = utils.resolve_from_id(src_target_data['id'], src_target_data['id_type'])
loader.load(new_var.targets[src_target], src_target_data)
# Fcurve
new_fcurve = new_driver.keyframe_points
@ -127,16 +127,14 @@ class BlDatablock(ReplicatedDatablock):
instance.uuid = self.uuid
def resolve(self, construct = True):
datablock_ref = None
datablock_root = getattr(bpy.data, self.bl_id)
try:
datablock_ref = datablock_root[self.data['name']]
except Exception:
pass
datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
if not datablock_ref:
datablock_ref = utils.find_from_attr('uuid', self.uuid, datablock_root)
try:
datablock_ref = datablock_root[self.data['name']]
except Exception:
pass
if construct and not datablock_ref:
name = self.data.get('name')
@ -163,19 +161,17 @@ class BlDatablock(ReplicatedDatablock):
def _dump(self, instance=None):
dumper = Dumper()
data = {}
animation_data = {}
# Dump animation data
if has_action(instance):
dumper = Dumper()
dumper.include_filter = ['action']
data['animation_data'] = dumper.dump(instance.animation_data)
animation_data['action'] = instance.animation_data.action.name
if has_driver(instance):
dumped_drivers = {'animation_data': {'drivers': []}}
animation_data['drivers'] = []
for driver in instance.animation_data.drivers:
dumped_drivers['animation_data']['drivers'].append(
dump_driver(driver))
animation_data['drivers'].append(dump_driver(driver))
data.update(dumped_drivers)
if animation_data:
data['animation_data'] = animation_data
if self.is_library:
data.update(dumper.dump(instance))
@ -202,6 +198,9 @@ class BlDatablock(ReplicatedDatablock):
if 'action' in data['animation_data']:
target.animation_data.action = bpy.data.actions[data['animation_data']['action']]
elif target.animation_data.action:
target.animation_data.action = None
# Remove existing animation data if there is nothing more to load
elif hasattr(target, 'animation_data') and target.animation_data:
target.animation_data_clear()


@ -68,12 +68,15 @@ class BlFile(ReplicatedDatablock):
self.preferences = utils.get_preferences()
def resolve(self, construct = True):
if self.data:
self.instance = Path(get_filepath(self.data['name']))
self.instance = Path(get_filepath(self.data['name']))
file_exists = self.instance.exists()
if not file_exists:
logging.debug("File don't exist, loading it.")
self._load(self.data, self.instance)
return file_exists
if not self.instance.exists():
logging.debug("File don't exist, loading it.")
self._load(self.data, self.instance)
def push(self, socket, identity=None, check_data=False):
super().push(socket, identity=None, check_data=False)
@ -131,6 +134,8 @@ class BlFile(ReplicatedDatablock):
if self.preferences.clear_memory_filecache:
return False
else:
if not self.instance:
return False
memory_size = sys.getsizeof(self.data['file'])-33
disk_size = self.instance.stat().st_size
return memory_size != disk_size


@ -68,7 +68,10 @@ class BlImage(BlDatablock):
target.source = 'FILE'
target.filepath_raw = get_filepath(data['filename'])
target.colorspace_settings.name = data["colorspace_settings"]["name"]
color_space_name = data["colorspace_settings"]["name"]
if color_space_name:
target.colorspace_settings.name = color_space_name
def _dump(self, instance=None):
assert(instance)
@ -83,6 +86,7 @@ class BlImage(BlDatablock):
dumper.depth = 2
dumper.include_filter = [
"name",
# 'source',
'size',
'height',
'alpha',


@ -27,7 +27,7 @@ from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock, get_datablock_from_uuid
NODE_SOCKET_INDEX = re.compile('\[(\d*)\]')
IGNORED_SOCKETS = ['GEOMETRY', 'SHADER', 'CUSTOM']
def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
""" Load a node into a node_tree from a dict
@ -52,31 +52,137 @@ def load_node(node_data: dict, node_tree: bpy.types.ShaderNodeTree):
inputs_data = node_data.get('inputs')
if inputs_data:
inputs = target_node.inputs
for idx, inpt in enumerate(inputs_data):
if idx < len(inputs) and hasattr(inputs[idx], "default_value"):
inputs = [i for i in target_node.inputs if i.type not in IGNORED_SOCKETS]
for idx, inpt in enumerate(inputs):
if idx < len(inputs_data) and hasattr(inpt, "default_value"):
loaded_input = inputs_data[idx]
try:
inputs[idx].default_value = inpt
if inpt.type in ['OBJECT', 'COLLECTION']:
inpt.default_value = get_datablock_from_uuid(loaded_input, None)
else:
inpt.default_value = loaded_input
except Exception as e:
logging.warning(f"Node {target_node.name} input {inputs[idx].name} parameter not supported, skipping ({e})")
logging.warning(f"Node {target_node.name} input {inpt.name} parameter not supported, skipping ({e})")
else:
logging.warning(f"Node {target_node.name} input length mismatch.")
outputs_data = node_data.get('outputs')
if outputs_data:
outputs = target_node.outputs
for idx, output in enumerate(outputs_data):
if idx < len(outputs) and hasattr(outputs[idx], "default_value"):
outputs = [o for o in target_node.outputs if o.type not in IGNORED_SOCKETS]
for idx, output in enumerate(outputs):
if idx < len(outputs_data) and hasattr(output, "default_value"):
loaded_output = outputs_data[idx]
try:
outputs[idx].default_value = output
except:
if output.type in ['OBJECT', 'COLLECTION']:
output.default_value = get_datablock_from_uuid(loaded_output, None)
else:
output.default_value = loaded_output
except Exception as e:
logging.warning(
f"Node {target_node.name} output {outputs[idx].name} parameter not supported, skipping ({e})")
f"Node {target_node.name} output {output.name} parameter not supported, skipping ({e})")
else:
logging.warning(
f"Node {target_node.name} output length mismatch.")
def dump_node(node: bpy.types.ShaderNode) -> dict:
""" Dump a single node to a dict
:arg node: target node
:type node: bpy.types.Node
:return: dict
"""
node_dumper = Dumper()
node_dumper.depth = 1
node_dumper.exclude_filter = [
"dimensions",
"show_expanded",
"name_full",
"select",
"bl_label",
"bl_height_min",
"bl_height_max",
"bl_height_default",
"bl_width_min",
"bl_width_max",
"type",
"bl_icon",
"bl_width_default",
"bl_static_type",
"show_tetxure",
"is_active_output",
"hide",
"show_options",
"show_preview",
"show_texture",
"outputs",
"width_hidden",
"image"
]
dumped_node = node_dumper.dump(node)
if node.parent:
dumped_node['parent'] = node.parent.name
dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
if dump_io_needed:
io_dumper = Dumper()
io_dumper.depth = 2
io_dumper.include_filter = ["default_value"]
if hasattr(node, 'inputs'):
dumped_node['inputs'] = []
inputs = [i for i in node.inputs if i.type not in IGNORED_SOCKETS]
for idx, inpt in enumerate(inputs):
if hasattr(inpt, 'default_value'):
if isinstance(inpt.default_value, bpy.types.ID):
dumped_input = inpt.default_value.uuid
else:
dumped_input = io_dumper.dump(inpt.default_value)
dumped_node['inputs'].append(dumped_input)
if hasattr(node, 'outputs'):
dumped_node['outputs'] = []
for idx, output in enumerate(node.outputs):
if output.type not in IGNORED_SOCKETS:
if hasattr(output, 'default_value'):
dumped_node['outputs'].append(
io_dumper.dump(output.default_value))
if hasattr(node, 'color_ramp'):
ramp_dumper = Dumper()
ramp_dumper.depth = 4
ramp_dumper.include_filter = [
'elements',
'alpha',
'color',
'position',
'interpolation',
'hue_interpolation',
'color_mode'
]
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
if hasattr(node, 'mapping'):
curve_dumper = Dumper()
curve_dumper.depth = 5
curve_dumper.include_filter = [
'curves',
'points',
'location'
]
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
if hasattr(node, 'image') and getattr(node, 'image'):
dumped_node['image_uuid'] = node.image.uuid
if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
dumped_node['node_tree_uuid'] = node.node_tree.uuid
return dumped_node
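As a usage sketch (not part of this commit): dump_node and load_node can round-trip a single node between trees. The material and node names are assumptions, and load_node is assumed to re-create the node from its dumped identifier before loading parameters:

import bpy

src_tree = bpy.data.materials["MatA"].node_tree            # hypothetical materials
dst_tree = bpy.data.materials["MatB"].node_tree
node_data = dump_node(src_tree.nodes["Principled BSDF"])   # inputs/outputs, ramps, mapping
load_node(node_data, dst_tree)                             # re-created in the target tree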
def load_links(links_data, node_tree):
""" Load node_tree links from a list
@ -119,94 +225,7 @@ def dump_links(links):
return links_data
def dump_node(node: bpy.types.ShaderNode) -> dict:
""" Dump a single node to a dict
:arg node: target node
:type node: bpy.types.Node
:return: dict
"""
node_dumper = Dumper()
node_dumper.depth = 1
node_dumper.exclude_filter = [
"dimensions",
"show_expanded",
"name_full",
"select",
"bl_label",
"bl_height_min",
"bl_height_max",
"bl_height_default",
"bl_width_min",
"bl_width_max",
"type",
"bl_icon",
"bl_width_default",
"bl_static_type",
"show_tetxure",
"is_active_output",
"hide",
"show_options",
"show_preview",
"show_texture",
"outputs",
"width_hidden",
"image"
]
dumped_node = node_dumper.dump(node)
dump_io_needed = (node.type not in ['REROUTE', 'OUTPUT_MATERIAL'])
if dump_io_needed:
io_dumper = Dumper()
io_dumper.depth = 2
io_dumper.include_filter = ["default_value"]
if hasattr(node, 'inputs'):
dumped_node['inputs'] = []
for idx, inpt in enumerate(node.inputs):
if hasattr(inpt, 'default_value'):
dumped_node['inputs'].append(
io_dumper.dump(inpt.default_value))
if hasattr(node, 'outputs'):
dumped_node['outputs'] = []
for idx, output in enumerate(node.outputs):
if hasattr(output, 'default_value'):
dumped_node['outputs'].append(
io_dumper.dump(output.default_value))
if hasattr(node, 'color_ramp'):
ramp_dumper = Dumper()
ramp_dumper.depth = 4
ramp_dumper.include_filter = [
'elements',
'alpha',
'color',
'position',
'interpolation',
'color_mode'
]
dumped_node['color_ramp'] = ramp_dumper.dump(node.color_ramp)
if hasattr(node, 'mapping'):
curve_dumper = Dumper()
curve_dumper.depth = 5
curve_dumper.include_filter = [
'curves',
'points',
'location'
]
dumped_node['mapping'] = curve_dumper.dump(node.mapping)
if hasattr(node, 'image') and getattr(node, 'image'):
dumped_node['image_uuid'] = node.image.uuid
if hasattr(node, 'node_tree') and getattr(node, 'node_tree'):
dumped_node['node_tree_uuid'] = node.node_tree.uuid
return dumped_node
def dump_shader_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
def dump_node_tree(node_tree: bpy.types.ShaderNodeTree) -> dict:
""" Dump a shader node_tree to a dict including links and nodes
:arg node_tree: dumped shader node tree
@ -262,7 +281,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
"""
# Check for removed sockets
for socket in sockets:
if not [s for s in sockets_data if socket['uuid'] == s[2]]:
if not [s for s in sockets_data if 'uuid' in socket and socket['uuid'] == s[2]]:
sockets.remove(socket)
# Check for new sockets
@ -276,7 +295,7 @@ def load_node_tree_sockets(sockets: bpy.types.Collection,
s['uuid'] = socket_data[2]
def load_shader_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
def load_node_tree(node_tree_data: dict, target_node_tree: bpy.types.ShaderNodeTree) -> dict:
"""Load a shader node_tree from dumped data
:arg node_tree_data: dumped node data
@ -302,6 +321,14 @@ def load_shader_node_tree(node_tree_data: dict, target_node_tree: bpy.types.Shad
for node in node_tree_data["nodes"]:
load_node(node_tree_data["nodes"][node], target_node_tree)
for node_id, node_data in node_tree_data["nodes"].items():
target_node = target_node_tree.nodes.get(node_id, None)
if target_node is None:
continue
elif 'parent' in node_data:
target_node.parent = target_node_tree.nodes[node_data['parent']]
else:
target_node.parent = None
# TODO: load only required nodes links
# Load nodes links
target_node_tree.links.clear()
@ -316,6 +343,8 @@ def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
def has_node_group(node): return (
hasattr(node, 'node_tree') and node.node_tree)
def has_texture(node): return (
node.type in ['ATTRIBUTE_SAMPLE_TEXTURE','TEXTURE'] and node.texture)
deps = []
for node in node_tree.nodes:
@ -323,6 +352,8 @@ def get_node_tree_dependencies(node_tree: bpy.types.NodeTree) -> list:
deps.append(node.image)
elif has_node_group(node):
deps.append(node.node_tree)
elif has_texture(node):
deps.append(node.texture)
return deps
@ -353,10 +384,7 @@ def load_materials_slots(src_materials: list, dst_materials: bpy.types.bpy_prop_
if mat_uuid is not None:
mat_ref = get_datablock_from_uuid(mat_uuid, None)
else:
mat_ref = bpy.data.materials.get(mat_name, None)
if mat_ref is None:
raise Exception(f"Material {mat_name} doesn't exist")
mat_ref = bpy.data.materials[mat_name]
dst_materials.append(mat_ref)
@ -387,7 +415,7 @@ class BlMaterial(BlDatablock):
if target.node_tree is None:
target.use_nodes = True
load_shader_node_tree(data['node_tree'], target.node_tree)
load_node_tree(data['node_tree'], target.node_tree)
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -454,7 +482,7 @@ class BlMaterial(BlDatablock):
]
data['grease_pencil'] = gp_mat_dumper.dump(instance.grease_pencil)
elif instance.use_nodes:
data['node_tree'] = dump_shader_node_tree(instance.node_tree)
data['node_tree'] = dump_node_tree(instance.node_tree)
return data


@ -21,13 +21,13 @@ import mathutils
from .dump_anything import Dumper, Loader, np_dump_collection, np_load_collection
from .bl_datablock import BlDatablock
from .bl_material import (dump_shader_node_tree,
load_shader_node_tree,
from .bl_material import (dump_node_tree,
load_node_tree,
get_node_tree_dependencies)
class BlNodeGroup(BlDatablock):
bl_id = "node_groups"
bl_class = bpy.types.ShaderNodeTree
bl_class = bpy.types.NodeTree
bl_check_common = False
bl_icon = 'NODETREE'
bl_reload_parent = False
@ -36,10 +36,10 @@ class BlNodeGroup(BlDatablock):
return bpy.data.node_groups.new(data["name"], data["type"])
def _load_implementation(self, data, target):
load_shader_node_tree(data, target)
load_node_tree(data, target)
def _dump_implementation(self, data, instance=None):
return dump_shader_node_tree(instance)
return dump_node_tree(instance)
def _resolve_deps_implementation(self):
return get_node_tree_dependencies(self.instance)


@ -17,12 +17,14 @@
import logging
import re
import bpy
import mathutils
from replication.exception import ContextError
from .bl_datablock import BlDatablock, get_datablock_from_uuid
from .bl_material import IGNORED_SOCKETS
from .bl_action import dump_animation_data, load_animation_data, resolve_animation_dependencies
from .dump_anything import (
Dumper,
Loader,
@ -36,6 +38,127 @@ SKIN_DATA = [
'use_root'
]
SHAPEKEY_BLOCK_ATTR = [
'mute',
'value',
'slider_min',
'slider_max',
]
if bpy.app.version[1] >= 93:
SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str, float)
else:
SUPPORTED_GEOMETRY_NODE_PARAMETERS = (int, str)
logging.warning("Geometry node Float parameter not supported in \
blender 2.92.")
def get_node_group_inputs(node_group):
inputs = []
for inpt in node_group.inputs:
if inpt.type in IGNORED_SOCKETS:
continue
else:
inputs.append(inpt)
return inputs
# return [inpt.identifer for inpt in node_group.inputs if inpt.type not in IGNORED_SOCKETS]
def dump_physics(target: bpy.types.Object)->dict:
"""
Dump all physics settings from a given object excluding modifier
related physics settings (such as softbody, cloth, dynapaint and fluid)
"""
dumper = Dumper()
dumper.depth = 1
physics_data = {}
# Collisions (collision)
if target.collision and target.collision.use:
physics_data['collision'] = dumper.dump(target.collision)
# Field (field)
if target.field and target.field.type != "NONE":
physics_data['field'] = dumper.dump(target.field)
# Rigid Body (rigid_body)
if target.rigid_body:
physics_data['rigid_body'] = dumper.dump(target.rigid_body)
# Rigid Body constraint (rigid_body_constraint)
if target.rigid_body_constraint:
physics_data['rigid_body_constraint'] = dumper.dump(target.rigid_body_constraint)
return physics_data
def load_physics(dumped_settings: dict, target: bpy.types.Object):
""" Load all physics settings from a given object excluding modifier
related physics settings (such as softbody, cloth, dynapaint and fluid)
"""
loader = Loader()
if 'collision' in dumped_settings:
loader.load(target.collision, dumped_settings['collision'])
if 'field' in dumped_settings:
loader.load(target.field, dumped_settings['field'])
if 'rigid_body' in dumped_settings:
if not target.rigid_body:
bpy.ops.rigidbody.object_add({"object": target})
loader.load(target.rigid_body, dumped_settings['rigid_body'])
elif target.rigid_body:
bpy.ops.rigidbody.object_remove({"object": target})
if 'rigid_body_constraint' in dumped_settings:
if not target.rigid_body_constraint:
bpy.ops.rigidbody.constraint_add({"object": target})
loader.load(target.rigid_body_constraint, dumped_settings['rigid_body_constraint'])
elif target.rigid_body_constraint:
bpy.ops.rigidbody.constraint_remove({"object": target})
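A round-trip sketch for the two physics helpers above (not part of this commit; object names are assumptions):

import bpy

src = bpy.data.objects["Cube"]        # hypothetical; e.g. carries a rigid body
dst = bpy.data.objects["Cube.001"]
load_physics(dump_physics(src), dst)  # adds or removes rigid-body state to match the dump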
def dump_modifier_geometry_node_inputs(modifier: bpy.types.Modifier) -> list:
""" Dump geometry node modifier input properties
:arg modifier: geometry node modifier to dump
:type modifier: bpy.types.Modifier
"""
dumped_inputs = []
for inpt in get_node_group_inputs(modifier.node_group):
input_value = modifier[inpt.identifier]
dumped_input = None
if isinstance(input_value, bpy.types.ID):
dumped_input = input_value.uuid
elif isinstance(input_value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
dumped_input = input_value
elif hasattr(input_value, 'to_list'):
dumped_input = input_value.to_list()
dumped_inputs.append(dumped_input)
return dumped_inputs
def load_modifier_geometry_node_inputs(dumped_modifier: dict, target_modifier: bpy.types.Modifier):
""" Load geometry node modifier inputs
:arg dumped_modifier: source dumped modifier to load
:type dumped_modifier: dict
:arg target_modifier: target geometry node modifier
:type target_modifier: bpy.types.Modifier
"""
for input_index, inpt in enumerate(get_node_group_inputs(target_modifier.node_group)):
dumped_value = dumped_modifier['inputs'][input_index]
input_value = target_modifier[inpt.identifier]
if isinstance(input_value, SUPPORTED_GEOMETRY_NODE_PARAMETERS):
target_modifier[inpt.identifier] = dumped_value
elif hasattr(input_value, 'to_list'):
for index in range(len(input_value)):
input_value[index] = dumped_value[index]
elif inpt.type in ['COLLECTION', 'OBJECT']:
target_modifier[inpt.identifier] = get_datablock_from_uuid(
dumped_value, None)
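Usage sketch (not part of this commit): mirroring the inputs of one NODES modifier onto another that uses the same node group, so input indices line up. Object and modifier names are assumptions:

import bpy

src_mod = bpy.data.objects["A"].modifiers["GeometryNodes"]  # names assumed
dst_mod = bpy.data.objects["B"].modifiers["GeometryNodes"]  # must share the node group
dumped = {'inputs': dump_modifier_geometry_node_inputs(src_mod)}
load_modifier_geometry_node_inputs(dumped, dst_mod)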
def load_pose(target_bone, data):
target_bone.rotation_mode = data['rotation_mode']
loader = Loader()
@ -91,19 +214,43 @@ def _is_editmode(object: bpy.types.Object) -> bool:
child_data.is_editmode)
def find_textures_dependencies(collection):
""" Check collection
def find_textures_dependencies(modifiers: bpy.types.bpy_prop_collection) -> [bpy.types.Texture]:
""" Find textures lying in a modifier stack
:arg modifiers: modifiers collection
:type modifiers: bpy.types.bpy_prop_collection
:return: list of bpy.types.Texture pointers
"""
textures = []
for item in collection:
for attr in dir(item):
inst = getattr(item, attr)
if issubclass(type(inst), bpy.types.Texture) and inst is not None:
textures.append(inst)
for mod in modifiers:
modifier_attributes = [getattr(mod, attr_name)
for attr_name in mod.bl_rna.properties.keys()]
for attr in modifier_attributes:
if issubclass(type(attr), bpy.types.Texture) and attr is not None:
textures.append(attr)
return textures
def find_geometry_nodes_dependencies(modifiers: bpy.types.bpy_prop_collection) -> [bpy.types.NodeTree]:
""" Find geometry nodes dependencies from a modifier stack
:arg modifiers: modifiers collection
:type modifiers: bpy.types.bpy_prop_collection
:return: list of bpy.types.NodeTree pointers
"""
dependencies = []
for mod in modifiers:
if mod.type == 'NODES' and mod.node_group:
dependencies.append(mod.node_group)
# for inpt in get_node_group_inputs(mod.node_group):
# parameter = mod.get(inpt.identifier)
# if parameter and isinstance(parameter, bpy.types.ID):
# dependencies.append(parameter)
return dependencies
def dump_vertex_groups(src_object: bpy.types.Object) -> dict:
""" Dump object's vertex groups
@ -148,6 +295,147 @@ def load_vertex_groups(dumped_vertex_groups: dict, target_object: bpy.types.Obje
for index, weight in vg['vertices']:
vertex_group.add([index], weight, 'REPLACE')
def dump_shape_keys(target_key: bpy.types.Key)->dict:
""" Dump the target shape_keys datablock to a dict using numpy
:param target_key: target key datablock
:type target_key: bpy.types.Key
:return: dict
"""
dumped_key_blocks = []
dumper = Dumper()
dumper.include_filter = [
'name',
'mute',
'value',
'slider_min',
'slider_max',
]
for key in target_key.key_blocks:
dumped_key_block = dumper.dump(key)
dumped_key_block['data'] = np_dump_collection(key.data, ['co'])
dumped_key_block['relative_key'] = key.relative_key.name
dumped_key_blocks.append(dumped_key_block)
return {
'reference_key': target_key.reference_key.name,
'use_relative': target_key.use_relative,
'key_blocks': dumped_key_blocks,
'animation_data': dump_animation_data(target_key)
}
def load_shape_keys(dumped_shape_keys: dict, target_object: bpy.types.Object):
""" Load the target shape_keys datablock to a dict using numpy
:param dumped_key: src key data
:type dumped_key: bpy.types.Key
:param target_object: object used to load the shapekeys data onto
:type target_object: bpy.types.Object
"""
loader = Loader()
# Remove existing ones
target_object.shape_key_clear()
# Create keys and load vertices coords
dumped_key_blocks = dumped_shape_keys.get('key_blocks')
for dumped_key_block in dumped_key_blocks:
key_block = target_object.shape_key_add(name=dumped_key_block['name'])
loader.load(key_block, dumped_key_block)
np_load_collection(dumped_key_block['data'], key_block.data, ['co'])
# Load relative key after all
for dumped_key_block in dumped_key_blocks:
relative_key_name = dumped_key_block.get('relative_key')
key_name = dumped_key_block.get('name')
target_keyblock = target_object.data.shape_keys.key_blocks[key_name]
relative_key = target_object.data.shape_keys.key_blocks[relative_key_name]
target_keyblock.relative_key = relative_key
# Shape keys animation data
anim_data = dumped_shape_keys.get('animation_data')
if anim_data:
load_animation_data(anim_data, target_object.data.shape_keys)
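Usage sketch (not part of this commit) pairing the two shape-key helpers; object names are assumptions:

import bpy

src = bpy.data.objects["Head"]       # hypothetical mesh with shape keys
dst = bpy.data.objects["Head.001"]
load_shape_keys(dump_shape_keys(src.data.shape_keys), dst)  # rebuilds blocks, then relative keys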
def dump_modifiers(modifiers: bpy.types.bpy_prop_collection)->dict:
""" Dump all modifiers of a modifier collection into a dict
:param modifiers: modifiers
:type modifiers: bpy.types.bpy_prop_collection
:return: dict
"""
dumped_modifiers = {}
dumper = Dumper()
dumper.depth = 1
dumper.exclude_filter = ['is_active']
for index, modifier in enumerate(modifiers):
dumped_modifier = dumper.dump(modifier)
# hack to dump geometry nodes inputs
if modifier.type == 'NODES':
dumped_inputs = dump_modifier_geometry_node_inputs(
modifier)
dumped_modifier['inputs'] = dumped_inputs
elif modifier.type == 'PARTICLE_SYSTEM':
dumper.exclude_filter = [
"is_edited",
"is_editable",
"is_global_hair"
]
dumped_modifier['particle_system'] = dumper.dump(modifier.particle_system)
dumped_modifier['particle_system']['settings_uuid'] = modifier.particle_system.settings.uuid
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
dumped_modifier['settings'] = dumper.dump(modifier.settings)
elif modifier.type == 'UV_PROJECT':
dumped_modifier['projectors'] = [p.object.name for p in modifier.projectors if p and p.object]
dumped_modifiers[modifier.name] = dumped_modifier
return dumped_modifiers
def load_modifiers_custom_data(dumped_modifiers: dict, modifiers: bpy.types.bpy_prop_collection):
""" Load modifiers custom data not managed by the dump_anything loader
:param dumped_modifiers: modifiers to load
:type dumped_modifiers: dict
:param modifiers: target modifiers collection
:type modifiers: bpy.types.bpy_prop_collection
"""
loader = Loader()
for modifier in modifiers:
dumped_modifier = dumped_modifiers.get(modifier.name)
if modifier.type == 'NODES':
load_modifier_geometry_node_inputs(dumped_modifier, modifier)
elif modifier.type == 'PARTICLE_SYSTEM':
default = modifier.particle_system.settings
dumped_particles = dumped_modifier['particle_system']
loader.load(modifier.particle_system, dumped_particles)
settings = get_datablock_from_uuid(dumped_particles['settings_uuid'], None)
if settings:
modifier.particle_system.settings = settings
# Hack to remove the default generated particle settings
if not default.uuid:
bpy.data.particles.remove(default)
elif modifier.type in ['SOFT_BODY', 'CLOTH']:
loader.load(modifier.settings, dumped_modifier['settings'])
elif modifier.type == 'UV_PROJECT':
for projector_index, projector_object in enumerate(dumped_modifier['projectors']):
target_object = bpy.data.objects.get(projector_object)
if target_object:
modifier.projectors[projector_index].object = target_object
else:
logging.error("Could't load projector target object {projector_object}")
class BlObject(BlDatablock):
bl_id = "objects"
bl_class = bpy.types.Object
@ -171,13 +459,14 @@ class BlObject(BlDatablock):
object_name = data.get("name")
data_uuid = data.get("data_uuid")
data_id = data.get("data")
data_type = data.get("type")
object_data = get_datablock_from_uuid(
data_uuid,
find_data_from_name(data_id),
ignore=['images']) # TODO: use resolve_from_id
if object_data is None and data_uuid:
if data_type != 'EMPTY' and object_data is None:
raise Exception(f"Fail to load object {data['name']}({self.uuid})")
instance = bpy.data.objects.new(object_name, object_data)
@ -203,31 +492,27 @@ class BlObject(BlDatablock):
object_data = target.data
# SHAPE KEYS
if 'shape_keys' in data:
target.shape_key_clear()
# Create keys and load vertices coords
for key_block in data['shape_keys']['key_blocks']:
key_data = data['shape_keys']['key_blocks'][key_block]
target.shape_key_add(name=key_block)
loader.load(
target.data.shape_keys.key_blocks[key_block], key_data)
for vert in key_data['data']:
target.data.shape_keys.key_blocks[key_block].data[vert].co = key_data['data'][vert]['co']
# Load relative key after all
for key_block in data['shape_keys']['key_blocks']:
reference = data['shape_keys']['key_blocks'][key_block]['relative_key']
target.data.shape_keys.key_blocks[key_block].relative_key = target.data.shape_keys.key_blocks[reference]
shape_keys = data.get('shape_keys')
if shape_keys:
load_shape_keys(shape_keys, target)
# Load transformation data
loader.load(target, data)
# Object display fields
if 'display' in data:
loader.load(target.display, data['display'])
# Parenting
parent_id = data.get('parent_uid')
if parent_id:
parent = get_datablock_from_uuid(parent_id[0], bpy.data.objects[parent_id[1]])
# Avoid reloading
if target.parent != parent and parent is not None:
target.parent = parent
elif target.parent:
target.parent = None
# Pose
if 'pose' in data:
if not target.pose:
@ -272,9 +557,23 @@ class BlObject(BlDatablock):
SKIN_DATA)
if hasattr(target, 'cycles_visibility') \
and 'cycles_visibility' in data:
and 'cycles_visibility' in data:
loader.load(target.cycles_visibility, data['cycles_visibility'])
if hasattr(target, 'modifiers'):
load_modifiers_custom_data(data['modifiers'], target.modifiers)
# PHYSICS
load_physics(data, target)
transform = data.get('transforms', None)
if transform:
target.matrix_parent_inverse = mathutils.Matrix(
transform['matrix_parent_inverse'])
target.matrix_basis = mathutils.Matrix(transform['matrix_basis'])
target.matrix_local = mathutils.Matrix(transform['matrix_local'])
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -289,7 +588,6 @@ class BlObject(BlDatablock):
dumper.include_filter = [
"name",
"rotation_mode",
"parent",
"data",
"library",
"empty_display_type",
@ -304,8 +602,6 @@ class BlObject(BlDatablock):
"color",
"instance_collection",
"instance_type",
"location",
"scale",
'lock_location',
'lock_rotation',
'lock_scale',
@ -319,12 +615,16 @@ class BlObject(BlDatablock):
'show_all_edges',
'show_texture_space',
'show_in_front',
'type',
'rotation_quaternion' if instance.rotation_mode == 'QUATERNION' else 'rotation_euler',
'type'
]
data = dumper.dump(instance)
dumper.include_filter = [
'matrix_parent_inverse',
'matrix_local',
'matrix_basis']
data['transforms'] = dumper.dump(instance)
dumper.include_filter = [
'show_shadows',
]
@ -334,15 +634,14 @@ class BlObject(BlDatablock):
if self.is_library:
return data
# PARENTING
if instance.parent:
data['parent_uid'] = (instance.parent.uuid, instance.parent.name)
# MODIFIERS
modifiers = getattr(instance, 'modifiers', None)
if hasattr(instance, 'modifiers'):
data["modifiers"] = {}
modifiers = getattr(instance, 'modifiers', None)
if modifiers:
dumper.include_filter = None
dumper.depth = 1
for index, modifier in enumerate(modifiers):
data["modifiers"][modifier.name] = dumper.dump(modifier)
data['modifiers'] = dump_modifiers(modifiers)
gp_modifiers = getattr(instance, 'grease_pencil_modifiers', None)
@ -365,6 +664,7 @@ class BlObject(BlDatablock):
'location']
gp_mod_data['curve'] = curve_dumper.dump(modifier.curve)
# CONSTRAINTS
if hasattr(instance, 'constraints'):
dumper.include_filter = None
@ -409,7 +709,6 @@ class BlObject(BlDatablock):
bone_groups[group.name] = dumper.dump(group)
data['pose']['bone_groups'] = bone_groups
# VERTEX GROUP
if len(instance.vertex_groups) > 0:
data['vertex_groups'] = dump_vertex_groups(instance)
@ -417,36 +716,14 @@ class BlObject(BlDatablock):
# SHAPE KEYS
object_data = instance.data
if hasattr(object_data, 'shape_keys') and object_data.shape_keys:
dumper = Dumper()
dumper.depth = 2
dumper.include_filter = [
'reference_key',
'use_relative'
]
data['shape_keys'] = dumper.dump(object_data.shape_keys)
data['shape_keys']['reference_key'] = object_data.shape_keys.reference_key.name
key_blocks = {}
for key in object_data.shape_keys.key_blocks:
dumper.depth = 3
dumper.include_filter = [
'name',
'data',
'mute',
'value',
'slider_min',
'slider_max',
'data',
'co'
]
key_blocks[key.name] = dumper.dump(key)
key_blocks[key.name]['relative_key'] = key.relative_key.name
data['shape_keys']['key_blocks'] = key_blocks
data['shape_keys'] = dump_shape_keys(object_data.shape_keys)
# SKIN VERTICES
if hasattr(object_data, 'skin_vertices') and object_data.skin_vertices:
skin_vertices = list()
for skin_data in object_data.skin_vertices:
skin_vertices.append(np_dump_collection(skin_data.data, SKIN_DATA))
skin_vertices.append(
np_dump_collection(skin_data.data, SKIN_DATA))
data['skin_vertices'] = skin_vertices
# CYCLE SETTINGS
@ -461,6 +738,9 @@ class BlObject(BlDatablock):
]
data['cycles_visibility'] = dumper.dump(instance.cycles_visibility)
# PHYSICS
data.update(dump_physics(instance))
return data
def _resolve_deps_implementation(self):
@ -469,17 +749,25 @@ class BlObject(BlDatablock):
# Avoid Empty case
if self.instance.data:
deps.append(self.instance.data)
if self.instance.parent :
deps.append(self.instance.parent)
# Particle systems
for particle_slot in self.instance.particle_systems:
deps.append(particle_slot.settings)
if self.is_library:
deps.append(self.instance.library)
if self.instance.parent:
deps.append(self.instance.parent)
if self.instance.instance_type == 'COLLECTION':
# TODO: uuid based
deps.append(self.instance.instance_collection)
if self.instance.modifiers:
deps.extend(find_textures_dependencies(self.instance.modifiers))
deps.extend(find_geometry_nodes_dependencies(self.instance.modifiers))
if hasattr(self.instance.data, 'shape_keys') and self.instance.data.shape_keys:
deps.extend(resolve_animation_dependencies(self.instance.data.shape_keys))
return deps


@ -0,0 +1,90 @@
import bpy
import mathutils
from . import dump_anything
from .bl_datablock import BlDatablock, get_datablock_from_uuid
def dump_textures_slots(texture_slots: bpy.types.bpy_prop_collection) -> list:
""" Dump every texture slot collection as the form:
[(index, slot_texture_uuid, slot_texture_name), (), ...]
"""
dumped_slots = []
for index, slot in enumerate(texture_slots):
if slot and slot.texture:
dumped_slots.append((index, slot.texture.uuid, slot.texture.name))
return dumped_slots
def load_texture_slots(dumped_slots: list, target_slots: bpy.types.bpy_prop_collection):
"""
"""
for index, slot in enumerate(target_slots):
if slot:
target_slots.clear(index)
for index, slot_uuid, slot_name in dumped_slots:
target_slots.create(index).texture = get_datablock_from_uuid(
slot_uuid, slot_name
)
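Usage sketch (not part of this commit); the particle settings name is an assumption, and uuid is the custom property the add-on registers on datablocks:

import bpy

settings = bpy.data.particles["ParticleSettings"]    # hypothetical datablock
slots = dump_textures_slots(settings.texture_slots)
# e.g. [(0, '9a2f...', 'NoiseTexture')] as (index, texture uuid, texture name)
load_texture_slots(slots, settings.texture_slots)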
IGNORED_ATTR = [
"is_embedded_data",
"is_evaluated",
"is_fluid",
"is_library_indirect",
"users"
]
class BlParticle(BlDatablock):
bl_id = "particles"
bl_class = bpy.types.ParticleSettings
bl_icon = "PARTICLES"
bl_check_common = False
bl_reload_parent = False
def _construct(self, data):
instance = bpy.data.particles.new(data["name"])
instance.uuid = self.uuid
return instance
def _load_implementation(self, data, target):
dump_anything.load(target, data)
dump_anything.load(target.effector_weights, data["effector_weights"])
# Force field
force_field_1 = data.get("force_field_1", None)
if force_field_1:
dump_anything.load(target.force_field_1, force_field_1)
force_field_2 = data.get("force_field_2", None)
if force_field_2:
dump_anything.load(target.force_field_2, force_field_2)
# Texture slots
load_texture_slots(data["texture_slots"], target.texture_slots)
def _dump_implementation(self, data, instance=None):
assert instance
dumper = dump_anything.Dumper()
dumper.depth = 1
dumper.exclude_filter = IGNORED_ATTR
data = dumper.dump(instance)
# Particle effectors
data["effector_weights"] = dumper.dump(instance.effector_weights)
if instance.force_field_1:
data["force_field_1"] = dumper.dump(instance.force_field_1)
if instance.force_field_2:
data["force_field_2"] = dumper.dump(instance.force_field_2)
# Texture slots
data["texture_slots"] = dump_textures_slots(instance.texture_slots)
return data
def _resolve_deps_implementation(self):
return [t.texture for t in self.instance.texture_slots if t and t.texture]


@ -368,6 +368,8 @@ def load_sequence(sequence_data: dict, sequence_editor: bpy.types.SequenceEditor
class BlScene(BlDatablock):
is_root = True
bl_id = "scenes"
bl_class = bpy.types.Scene
bl_check_common = True


@ -21,8 +21,8 @@ import mathutils
from .dump_anything import Loader, Dumper
from .bl_datablock import BlDatablock
from .bl_material import (load_shader_node_tree,
dump_shader_node_tree,
from .bl_material import (load_node_tree,
dump_node_tree,
get_node_tree_dependencies)
@ -44,7 +44,7 @@ class BlWorld(BlDatablock):
if target.node_tree is None:
target.use_nodes = True
load_shader_node_tree(data['node_tree'], target.node_tree)
load_node_tree(data['node_tree'], target.node_tree)
def _dump_implementation(self, data, instance=None):
assert(instance)
@ -58,7 +58,7 @@ class BlWorld(BlDatablock):
]
data = world_dumper.dump(instance)
if instance.use_nodes:
data['node_tree'] = dump_shader_node_tree(instance.node_tree)
data['node_tree'] = dump_node_tree(instance.node_tree)
return data


@ -596,6 +596,8 @@ class Loader:
instance.write(bpy.data.textures.get(dump))
elif isinstance(rna_property_type, T.ColorRamp):
self._load_default(instance, dump)
elif isinstance(rna_property_type, T.NodeTree):
instance.write(bpy.data.node_groups.get(dump))
elif isinstance(rna_property_type, T.Object):
instance.write(bpy.data.objects.get(dump))
elif isinstance(rna_property_type, T.Mesh):
@ -608,6 +610,8 @@ class Loader:
instance.write(bpy.data.fonts.get(dump))
elif isinstance(rna_property_type, T.Sound):
instance.write(bpy.data.sounds.get(dump))
# elif isinstance(rna_property_type, T.ParticleSettings):
# instance.write(bpy.data.particles.get(dump))
def _load_matrix(self, matrix, dump):
matrix.write(mathutils.Matrix(dump))

@ -0,0 +1 @@
Subproject commit 8c27d0cec6b7db1756a7d142c94023fe20f352ff


@ -32,6 +32,7 @@ from operator import itemgetter
from pathlib import Path
from queue import Queue
from time import gmtime, strftime
import traceback
try:
import _pickle as pickle
@ -44,9 +45,11 @@ from bpy.app.handlers import persistent
from bpy_extras.io_utils import ExportHelper, ImportHelper
from replication.constants import (COMMITED, FETCHED, RP_COMMON, STATE_ACTIVE,
STATE_INITIAL, STATE_SYNCING, UP)
from replication.data import ReplicatedDataFactory
from replication.exception import NonAuthorizedOperationError, ContextError
from replication.data import DataTranslationProtocol
from replication.exception import ContextError, NonAuthorizedOperationError
from replication.interface import session
from replication.porcelain import add, apply
from replication.repository import Repository
from . import bl_types, environment, timers, ui, utils
from .presence import SessionStatusWidget, renderer, view3d_find
@ -80,8 +83,8 @@ def initialize_session():
# Step 1: Construct nodes
logging.info("Constructing nodes")
for node in session._graph.list_ordered():
node_ref = session.get(uuid=node)
for node in session.repository.list_ordered():
node_ref = session.repository.get_node(node)
if node_ref is None:
logging.error(f"Can't construct node {node}")
elif node_ref.state == FETCHED:
@ -89,8 +92,8 @@ def initialize_session():
# Step 2: Load nodes
logging.info("Loading nodes")
for node in session._graph.list_ordered():
node_ref = session.get(uuid=node)
for node in session.repository.list_ordered():
node_ref = session.repository.get_node(node)
if node_ref is None:
logging.error(f"Can't load node {node}")
@ -186,7 +189,7 @@ class SessionStartOperator(bpy.types.Operator):
handler.setFormatter(formatter)
bpy_factory = ReplicatedDataFactory()
bpy_protocol = DataTranslationProtocol()
supported_bl_types = []
# init the factory with supported types
@ -205,22 +208,17 @@ class SessionStartOperator(bpy.types.Operator):
type_local_config = settings.supported_datablocks[type_impl_name]
bpy_factory.register_type(
bpy_protocol.register_type(
type_module_class.bl_class,
type_module_class,
check_common=type_module_class.bl_check_common)
deleyables.append(timers.ApplyTimer(timeout=settings.depsgraph_update_rate))
if bpy.app.version[1] >= 91:
python_binary_path = sys.executable
else:
python_binary_path = bpy.app.binary_path_python
session.configure(
factory=bpy_factory,
python_path=python_binary_path,
external_update_handling=True)
repo = Repository(data_protocol=bpy_protocol)
# Host a session
if self.host:
@ -231,13 +229,14 @@ class SessionStartOperator(bpy.types.Operator):
runtime_settings.internet_ip = environment.get_ip()
try:
# Init repository
for scene in bpy.data.scenes:
session.add(scene)
add(repo, scene)
session.host(
repository= repo,
id=settings.username,
port=settings.port,
ipc_port=settings.ipc_port,
timeout=settings.connection_timeout,
password=admin_pass,
cache_directory=settings.cache_directory,
@ -247,7 +246,6 @@ class SessionStartOperator(bpy.types.Operator):
except Exception as e:
self.report({'ERROR'}, repr(e))
logging.error(f"Error: {e}")
import traceback
traceback.print_exc()
# Join a session
else:
@ -258,10 +256,10 @@ class SessionStartOperator(bpy.types.Operator):
try:
session.connect(
repository= repo,
id=settings.username,
address=settings.ip,
port=settings.port,
ipc_port=settings.ipc_port,
timeout=settings.connection_timeout,
password=admin_pass
)
@ -272,6 +270,7 @@ class SessionStartOperator(bpy.types.Operator):
# Background client updates service
deleyables.append(timers.ClientUpdate())
deleyables.append(timers.DynamicRightSelectTimer())
deleyables.append(timers.ApplyTimer(timeout=settings.depsgraph_update_rate))
# deleyables.append(timers.PushTimer(
# queue=stagging,
# timeout=settings.depsgraph_update_rate
@ -280,7 +279,9 @@ class SessionStartOperator(bpy.types.Operator):
session_user_sync = timers.SessionUserSync()
session_background_executor = timers.MainThreadExecutor(
execution_queue=background_execution_queue)
session_listen = timers.SessionListenTimer(timeout=0.001)
session_listen.register()
session_update.register()
session_user_sync.register()
session_background_executor.register()
@ -288,7 +289,7 @@ class SessionStartOperator(bpy.types.Operator):
deleyables.append(session_background_executor)
deleyables.append(session_update)
deleyables.append(session_user_sync)
deleyables.append(session_listen)
self.report(
@ -329,7 +330,7 @@ class SessionInitOperator(bpy.types.Operator):
utils.clean_scene()
for scene in bpy.data.scenes:
session.add(scene)
add(session.repository, scene)
session.init()
@ -351,7 +352,7 @@ class SessionStopOperator(bpy.types.Operator):
if session:
try:
session.disconnect()
session.disconnect(reason='user')
except Exception as e:
self.report({'ERROR'}, repr(e))
@ -600,17 +601,22 @@ class SessionApply(bpy.types.Operator):
def execute(self, context):
logging.debug(f"Running apply on {self.target}")
try:
node_ref = session.get(uuid=self.target)
session.apply(self.target,
force=True,
force_dependencies=self.reset_dependencies)
node_ref = session.repository.get_node(self.target)
apply(session.repository,
self.target,
force=True,
force_dependencies=self.reset_dependencies)
if node_ref.bl_reload_parent:
for parent in session._graph.find_parents(self.target):
for parent in session.repository.get_parents(self.target):
logging.debug(f"Refresh parent {parent}")
session.apply(parent, force=True)
apply(session.repository,
parent.uuid,
force=True)
except Exception as e:
self.report({'ERROR'}, repr(e))
return {"CANCELED"}
traceback.print_exc()
return {"CANCELLED"}
return {"FINISHED"}
@ -650,15 +656,15 @@ class ApplyArmatureOperator(bpy.types.Operator):
return {'CANCELLED'}
if event.type == 'TIMER':
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
nodes = session.list(filter=bl_types.bl_armature.BlArmature)
for node in nodes:
node_ref = session.get(uuid=node)
node_ref = session.repository.get_node(node)
if node_ref.state == FETCHED:
try:
session.apply(node)
apply(session.repository, node)
except Exception as e:
logging.error("Fail to apply armature: {e}")
@ -795,7 +801,7 @@ class SessionSaveBackupOperator(bpy.types.Operator, ExportHelper):
@classmethod
def poll(cls, context):
return session.state['STATE'] == STATE_ACTIVE
return session.state == STATE_ACTIVE
class SessionStopAutoSaveOperator(bpy.types.Operator):
bl_idname = "session.cancel_autosave"
@ -804,7 +810,7 @@ class SessionStopAutoSaveOperator(bpy.types.Operator):
@classmethod
def poll(cls, context):
return (session.state['STATE'] == STATE_ACTIVE and 'SessionBackupTimer' in registry)
return (session.state == STATE_ACTIVE and 'SessionBackupTimer' in registry)
def execute(self, context):
autosave_timer = registry.get('SessionBackupTimer')
@ -829,7 +835,7 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
)
def execute(self, context):
from replication.graph import ReplicationGraph
from replication.repository import Repository
# TODO: add filechecks
@ -849,7 +855,7 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
# init the factory with supported types
bpy_factory = ReplicatedDataFactory()
bpy_protocol = DataTranslationProtocol()
for type in bl_types.types_to_register():
type_module = getattr(bl_types, type)
name = [e.capitalize() for e in type.split('_')[1:]]
@ -857,16 +863,16 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
type_module_class = getattr(type_module, type_impl_name)
bpy_factory.register_type(
bpy_protocol.register_type(
type_module_class.bl_class,
type_module_class)
graph = ReplicationGraph()
graph = Repository()
for node, node_data in nodes:
node_type = node_data.get('str_type')
impl = bpy_factory.get_implementation_from_net(node_type)
impl = bpy_protocol.get_implementation_from_net(node_type)
if impl:
logging.info(f"Loading {node}")
@ -874,7 +880,7 @@ class SessionLoadSaveOperator(bpy.types.Operator, ImportHelper):
uuid=node,
dependencies=node_data['dependencies'],
data=node_data['data'])
instance.store(graph)
graph.do_commit(instance)
instance.state = FETCHED
logging.info("Graph succefully loaded")
@ -923,7 +929,7 @@ classes = (
def update_external_dependencies():
nodes_ids = session.list(filter=bl_types.bl_file.BlFile)
for node_id in nodes_ids:
node = session.get(node_id)
node = session.repository.get_node(node_id)
if node and node.owner in [session.id, RP_COMMON] \
and node.has_changed():
session.commit(node_id)
@ -932,11 +938,11 @@ def update_external_dependencies():
def sanitize_deps_graph(remove_nodes: bool = False):
""" Cleanup the replication graph
"""
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
start = utils.current_milli_time()
rm_cpt = 0
for node_key in session.list():
node = session.get(node_key)
node = session.repository.get_node(node_key)
if node is None \
or (node.state == UP and not node.resolve(construct=False)):
if remove_nodes:
@ -957,18 +963,18 @@ def resolve_deps_graph(dummy):
A future solution would be to avoid storing datablock references...
"""
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
sanitize_deps_graph(remove_nodes=True)
@persistent
def load_pre_handler(dummy):
if session and session.state['STATE'] in [STATE_ACTIVE, STATE_SYNCING]:
if session and session.state in [STATE_ACTIVE, STATE_SYNCING]:
bpy.ops.session.stop()
@persistent
def update_client_frame(scene):
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
session.update_user_metadata({
'frame_current': scene.frame_current
})
@ -976,7 +982,7 @@ def update_client_frame(scene):
@persistent
def depsgraph_evaluation(scene):
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
context = bpy.context
blender_depsgraph = bpy.context.view_layer.depsgraph
dependency_updates = [u for u in blender_depsgraph.updates]
@ -989,13 +995,13 @@ def depsgraph_evaluation(scene):
# Is the object tracked?
if update.id.uuid:
# Retrieve local version
node = session.get(uuid=update.id.uuid)
node = session.repository.get_node(update.id.uuid)
# Check our rights on this update:
# - if it's ours or (under common and diff), launch the
# update process
# - if it belongs to someone else, ignore the update
if node and node.owner in [session.id, RP_COMMON]:
if node and (node.owner == session.id or node.bl_check_common):
if node.state == UP:
try:
if node.has_changed():
@ -1013,11 +1019,11 @@ def depsgraph_evaluation(scene):
continue
# A new scene is created
elif isinstance(update.id, bpy.types.Scene):
ref = session.get(reference=update.id)
ref = session.repository.get_node_by_datablock(update.id)
if ref:
ref.resolve()
else:
scn_uuid = session.add(update.id)
scn_uuid = add(session.repository, update.id)
session.commit(scn_uuid)
session.push(scn_uuid, check_data=False)
def register():
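Two behavioural changes land in `depsgraph_evaluation`: the ownership test narrows from `node.owner in [session.id, RP_COMMON]` to `node.owner == session.id or node.bl_check_common`, and newly created scenes are registered through the porcelain `add` before being committed and pushed. A sketch of the new-scene branch, grounded in the hunk above (the helper name is illustrative):

    from replication.interface import session
    from replication.porcelain import add

    def track_new_scene(scene):
        ref = session.repository.get_node_by_datablock(scene)
        if ref:
            ref.resolve()  # already tracked: refresh the local reference
        else:
            scn_uuid = add(session.repository, scene)  # register the datablock
            session.commit(scn_uuid)
            session.push(scn_uuid, check_data=False)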
@ -1035,7 +1041,7 @@ def register():
def unregister():
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
session.disconnect()
from bpy.utils import unregister_class

View File

@ -66,14 +66,6 @@ def update_ip(self, context):
self['ip'] = "127.0.0.1"
def update_port(self, context):
max_port = self.port + 3
if self.ipc_port < max_port and \
self['ipc_port'] >= self.port:
logging.error(
"IPC Port in conflict with the port, assigning a random value")
self['ipc_port'] = random.randrange(self.port+4, 10000)
def update_directory(self, context):
@ -174,12 +166,6 @@ class SessionPrefs(bpy.types.AddonPreferences):
supported_datablocks: bpy.props.CollectionProperty(
type=ReplicatedDatablock,
)
ipc_port: bpy.props.IntProperty(
name="ipc_port",
description='internal ttl port(only useful for multiple local instances)',
default=random.randrange(5570, 70000),
update=update_port,
)
init_method: bpy.props.EnumProperty(
name='init_method',
description='Init repo',
@ -195,7 +181,7 @@ class SessionPrefs(bpy.types.AddonPreferences):
connection_timeout: bpy.props.IntProperty(
name='connection timeout',
description='connection timeout before disconnection',
default=1000
default=5000
)
# Replication update settings
depsgraph_update_rate: bpy.props.FloatProperty(

View File

@ -30,7 +30,7 @@ import mathutils
from bpy_extras import view3d_utils
from gpu_extras.batch import batch_for_shader
from replication.constants import (STATE_ACTIVE, STATE_AUTH, STATE_CONFIG,
STATE_INITIAL, STATE_LAUNCHING_SERVICES,
STATE_INITIAL, CONNECTING,
STATE_LOBBY, STATE_QUITTING, STATE_SRV_SYNC,
STATE_SYNCING, STATE_WAITING)
from replication.interface import session
@ -399,7 +399,7 @@ class SessionStatusWidget(Widget):
text_scale = self.preferences.presence_hud_scale
ui_scale = bpy.context.preferences.view.ui_scale
color = [1, 1, 0, 1]
state = session.state.get('STATE')
state = session.state
state_str = f"{get_state_str(state)}"
if state == STATE_ACTIVE:

View File

@ -17,13 +17,14 @@
import logging
import sys
import traceback
import bpy
from replication.constants import (FETCHED, RP_COMMON, STATE_ACTIVE,
STATE_INITIAL, STATE_LOBBY, STATE_QUITTING,
STATE_SRV_SYNC, STATE_SYNCING, UP)
from replication.exception import NonAuthorizedOperationError, ContextError
from replication.interface import session
from replication.porcelain import apply, add
from . import operators, utils
from .presence import (UserFrustumWidget, UserNameWidget, UserSelectionWidget,
@ -71,7 +72,7 @@ class Timer(object):
except Exception as e:
logging.error(e)
self.unregister()
session.disconnect()
session.disconnect(reason=f"Error during timer {self.id} execution")
else:
if self.is_running:
return self._timeout
@ -100,24 +101,31 @@ class SessionBackupTimer(Timer):
def execute(self):
session.save(self._filepath)
class SessionListenTimer(Timer):
def execute(self):
session.listen()
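`SessionListenTimer` is new in this commit: it calls `session.listen()` on every tick, so incoming replication traffic is serviced from Blender's timer loop. A usage sketch; the constructor argument and `register()` call are assumptions inferred from the `Timer` base class (its `unregister()` counterpart is visible in the error path above):

    # Assumed registration pattern; the interval value is illustrative only.
    listen_timer = SessionListenTimer(timeout=0.001)
    listen_timer.register()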
class ApplyTimer(Timer):
def execute(self):
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
nodes = session.list()
for node in nodes:
node_ref = session.get(uuid=node)
node_ref = session.repository.get_node(node)
if node_ref.state == FETCHED:
try:
session.apply(node)
apply(session.repository, node)
except Exception as e:
logging.error(f"Fail to apply {node_ref.uuid}: {e}")
logging.error(f"Fail to apply {node_ref.uuid}")
traceback.print_exc()
else:
if node_ref.bl_reload_parent:
for parent in session._graph.find_parents(node):
for parent in session.repository.get_parents(node):
logging.debug("Refresh parent {node}")
session.apply(parent, force=True)
apply(session.repository,
parent.uuid,
force=True)
class DynamicRightSelectTimer(Timer):
@ -130,7 +138,7 @@ class DynamicRightSelectTimer(Timer):
def execute(self):
settings = utils.get_preferences()
if session and session.state['STATE'] == STATE_ACTIVE:
if session and session.state == STATE_ACTIVE:
# Find user
if self._user is None:
self._user = session.online_users.get(settings.username)
@ -144,7 +152,7 @@ class DynamicRightSelectTimer(Timer):
# if an annotation exists and is tracked
if annotation_gp and annotation_gp.uuid:
registered_gp = session.get(uuid=annotation_gp.uuid)
registered_gp = session.repository.get_node(annotation_gp.uuid)
if is_annotating(bpy.context):
# try to get the right on it
if registered_gp.owner == RP_COMMON:
@ -158,7 +166,7 @@ class DynamicRightSelectTimer(Timer):
affect_dependencies=False)
if registered_gp.owner == settings.username:
gp_node = session.get(uuid=annotation_gp.uuid)
gp_node = session.repository.get_node(annotation_gp.uuid)
if gp_node.has_changed():
session.commit(gp_node.uuid)
session.push(gp_node.uuid, check_data=False)
@ -182,7 +190,7 @@ class DynamicRightSelectTimer(Timer):
# change old selection right to common
for obj in obj_common:
node = session.get(uuid=obj)
node = session.repository.get_node(obj)
if node and (node.owner == settings.username or node.owner == RP_COMMON):
recursive = True
@ -200,7 +208,7 @@ class DynamicRightSelectTimer(Timer):
# change new selection to ours
for obj in obj_ours:
node = session.get(uuid=obj)
node = session.repository.get_node(obj)
if node and node.owner == RP_COMMON:
recursive = True
@ -233,7 +241,7 @@ class DynamicRightSelectTimer(Timer):
owned_keys = session.list(
filter_owner=settings.username)
for key in owned_keys:
node = session.get(uuid=key)
node = session.repository.get_node(key)
try:
session.change_owner(
key,
@ -262,7 +270,7 @@ class ClientUpdate(Timer):
settings = utils.get_preferences()
if session and renderer:
if session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]:
if session.state in [STATE_ACTIVE, STATE_LOBBY]:
local_user = session.online_users.get(
settings.username)

View File

@ -26,7 +26,7 @@ from replication.constants import (ADDED, ERROR, FETCHED,
STATE_INITIAL, STATE_SRV_SYNC,
STATE_WAITING, STATE_QUITTING,
STATE_LOBBY,
STATE_LAUNCHING_SERVICES)
CONNECTING)
from replication import __version__
from replication.interface import session
from .timers import registry
@ -71,9 +71,9 @@ class SESSION_PT_settings(bpy.types.Panel):
def draw_header(self, context):
layout = self.layout
if session and session.state['STATE'] != STATE_INITIAL:
if session and session.state != STATE_INITIAL:
cli_state = session.state
state = session.state.get('STATE')
state = session.state
connection_icon = "KEYTYPE_MOVING_HOLD_VEC"
if state == STATE_ACTIVE:
@ -81,7 +81,7 @@ class SESSION_PT_settings(bpy.types.Panel):
else:
connection_icon = 'PROP_CON'
layout.label(text=f"Session - {get_state_str(cli_state['STATE'])}", icon=connection_icon)
layout.label(text=f"Session - {get_state_str(cli_state)}", icon=connection_icon)
else:
layout.label(text=f"Session - v{__version__}",icon="PROP_OFF")
@ -94,13 +94,13 @@ class SESSION_PT_settings(bpy.types.Panel):
if hasattr(context.window_manager, 'session'):
# STATE INITIAL
if not session \
or (session and session.state['STATE'] == STATE_INITIAL):
or (session and session.state == STATE_INITIAL):
pass
else:
cli_state = session.state
progress = session.state_progress
row = layout.row()
current_state = cli_state['STATE']
current_state = session.state
info_msg = None
if current_state in [STATE_ACTIVE]:
@ -124,8 +124,8 @@ class SESSION_PT_settings(bpy.types.Panel):
if current_state in [STATE_SYNCING, STATE_SRV_SYNC, STATE_WAITING]:
info_box = row.box()
info_box.row().label(text=printProgressBar(
cli_state['CURRENT'],
cli_state['TOTAL'],
progress['current'],
progress['total'],
length=16
))
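Sync progress is no longer read from the state dict's 'CURRENT'/'TOTAL' keys; the panel consumes `session.state_progress` instead. A sketch of the new readout, assuming `state_progress` yields a dict with 'current' and 'total' keys, as the hunk implies:

    # Render the sync progress bar from the new progress accessor.
    if session.state in (STATE_SYNCING, STATE_SRV_SYNC, STATE_WAITING):
        progress = session.state_progress  # assumed {'current': int, 'total': int}
        bar = printProgressBar(progress['current'], progress['total'], length=16)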
@ -141,7 +141,7 @@ class SESSION_PT_settings_network(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] == 0)
or (session and session.state == 0)
def draw_header(self, context):
self.layout.label(text="", icon='URL')
@ -199,7 +199,7 @@ class SESSION_PT_settings_user(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] == 0)
or (session and session.state == 0)
def draw_header(self, context):
self.layout.label(text="", icon='USER')
@ -230,7 +230,7 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] == 0)
or (session and session.state == 0)
def draw_header(self, context):
self.layout.label(text="", icon='PREFERENCES')
@ -251,9 +251,6 @@ class SESSION_PT_advanced_settings(bpy.types.Panel):
emboss=False)
if settings.sidebar_advanced_net_expanded:
net_section_row = net_section.row()
net_section_row.label(text="IPC Port:")
net_section_row.prop(settings, "ipc_port", text="")
net_section_row = net_section.row()
net_section_row.label(text="Timeout (ms):")
net_section_row.prop(settings, "connection_timeout", text="")
@ -322,7 +319,7 @@ class SESSION_PT_user(bpy.types.Panel):
@classmethod
def poll(cls, context):
return session and session.state['STATE'] in [STATE_ACTIVE, STATE_LOBBY]
return session and session.state in [STATE_ACTIVE, STATE_LOBBY]
def draw_header(self, context):
self.layout.label(text="", icon='USER')
@ -353,7 +350,7 @@ class SESSION_PT_user(bpy.types.Panel):
if active_user != 0 and active_user.username != settings.username:
row = layout.row()
user_operations = row.split()
if session.state['STATE'] == STATE_ACTIVE:
if session.state == STATE_ACTIVE:
user_operations.alert = context.window_manager.session.time_snap_running
user_operations.operator(
@ -411,7 +408,7 @@ class SESSION_PT_presence(bpy.types.Panel):
@classmethod
def poll(cls, context):
return not session \
or (session and session.state['STATE'] in [STATE_INITIAL, STATE_ACTIVE])
or (session and session.state in [STATE_INITIAL, STATE_ACTIVE])
def draw_header(self, context):
self.layout.prop(context.window_manager.session,
@ -441,7 +438,7 @@ class SESSION_PT_presence(bpy.types.Panel):
def draw_property(context, parent, property_uuid, level=0):
settings = get_preferences()
runtime_settings = context.window_manager.session
item = session.get(uuid=property_uuid)
item = session.repository.get_node(property_uuid)
area_msg = parent.row(align=True)
@ -519,8 +516,8 @@ class SESSION_PT_repository(bpy.types.Panel):
admin = usr['admin']
return hasattr(context.window_manager, 'session') and \
session and \
(session.state['STATE'] == STATE_ACTIVE or \
session.state['STATE'] == STATE_LOBBY and admin)
(session.state == STATE_ACTIVE or \
session.state == STATE_LOBBY and admin)
def draw_header(self, context):
self.layout.label(text="", icon='OUTLINER_OB_GROUP_INSTANCE')
@ -536,7 +533,7 @@ class SESSION_PT_repository(bpy.types.Panel):
row = layout.row()
if session.state['STATE'] == STATE_ACTIVE:
if session.state == STATE_ACTIVE:
if 'SessionBackupTimer' in registry:
row.alert = True
row.operator('session.cancel_autosave', icon="CANCEL")
@ -568,7 +565,7 @@ class SESSION_PT_repository(bpy.types.Panel):
filter_owner=settings.username) if runtime_settings.filter_owned else session.list()
client_keys = [key for key in key_to_filter
if session.get(uuid=key).str_type
if session.repository.get_node(key).str_type
in types_filter]
if client_keys:
@ -579,7 +576,7 @@ class SESSION_PT_repository(bpy.types.Panel):
else:
row.label(text="Empty")
elif session.state['STATE'] == STATE_LOBBY and usr and usr['admin']:
elif session.state == STATE_LOBBY and usr and usr['admin']:
row.operator("session.init", icon='TOOL_SETTINGS', text="Init")
else:
row.label(text="Waiting to start")

View File

@ -36,7 +36,7 @@ from replication.constants import (STATE_ACTIVE, STATE_AUTH,
STATE_INITIAL, STATE_SRV_SYNC,
STATE_WAITING, STATE_QUITTING,
STATE_LOBBY,
STATE_LAUNCHING_SERVICES)
CONNECTING)
def find_from_attr(attr_name, attr_value, list):
@ -92,7 +92,7 @@ def get_state_str(state):
state_str = 'OFFLINE'
elif state == STATE_QUITTING:
state_str = 'QUITTING'
elif state == STATE_LAUNCHING_SERVICES:
elif state == CONNECTING:
state_str = 'LAUNCHING SERVICES'
elif state == STATE_LOBBY:
state_str = 'LOBBY'
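`STATE_LAUNCHING_SERVICES` is gone from `replication.constants`; the equivalent flag is now `CONNECTING`, and `get_state_str` keeps returning the same human-readable label for it. A trimmed sketch of the mapping after the rename (branches other than the two shown are elided):

    from replication.constants import CONNECTING, STATE_LOBBY

    def state_label(state):
        # trimmed sketch of get_state_str after the constant rename
        if state == CONNECTING:
            return 'LAUNCHING SERVICES'
        if state == STATE_LOBBY:
            return 'LOBBY'
        return 'UNKNOWN'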

View File

@ -13,7 +13,7 @@ def main():
if len(sys.argv) > 2:
blender_rev = sys.argv[2]
else:
blender_rev = "2.91.0"
blender_rev = "2.92.0"
try:
exit_val = BAT.test_blender_addon(addon_path=addon, blender_revision=blender_rev)

View File

@ -8,6 +8,7 @@ import random
from multi_user.bl_types.bl_action import BlAction
INTERPOLATION = ['CONSTANT', 'LINEAR', 'BEZIER', 'SINE', 'QUAD', 'CUBIC', 'QUART', 'QUINT', 'EXPO', 'CIRC', 'BACK', 'BOUNCE', 'ELASTIC']
FMODIFIERS = ['GENERATOR', 'FNGENERATOR', 'ENVELOPE', 'CYCLES', 'NOISE', 'LIMITS', 'STEPPED']
# @pytest.mark.parametrize('blendname', ['test_action.blend'])
def test_action(clear_blend):
@ -22,6 +23,9 @@ def test_action(clear_blend):
point.co[1] = random.randint(-10,10)
point.interpolation = INTERPOLATION[random.randint(0, len(INTERPOLATION)-1)]
for mod_type in FMODIFIERS:
fcurve_sample.modifiers.new(mod_type)
bpy.ops.mesh.primitive_plane_add()
bpy.data.objects[0].animation_data_create()
bpy.data.objects[0].animation_data.action = datablock

View File

@ -7,7 +7,7 @@ import bpy
import random
from multi_user.bl_types.bl_object import BlObject
# Removed 'BUILD' modifier because the seed doesn't seems to be
# Removed 'BUILD', 'SOFT_BODY' modifiers because the seed doesn't seem to be
# correctly initialized (#TODO: report the bug)
MOFIFIERS_TYPES = [
'DATA_TRANSFER', 'MESH_CACHE', 'MESH_SEQUENCE_CACHE',
@ -22,8 +22,7 @@ MOFIFIERS_TYPES = [
'MESH_DEFORM', 'SHRINKWRAP', 'SIMPLE_DEFORM', 'SMOOTH',
'CORRECTIVE_SMOOTH', 'LAPLACIANSMOOTH', 'SURFACE_DEFORM',
'WARP', 'WAVE', 'CLOTH', 'COLLISION', 'DYNAMIC_PAINT',
'EXPLODE', 'FLUID', 'OCEAN', 'PARTICLE_INSTANCE',
'SOFT_BODY', 'SURFACE']
'EXPLODE', 'FLUID', 'OCEAN', 'PARTICLE_INSTANCE', 'SURFACE']
GP_MODIFIERS_TYPE = [
'GP_ARRAY', 'GP_BUILD', 'GP_MIRROR', 'GP_MULTIPLY',
@ -72,5 +71,5 @@ def test_object(clear_blend):
test = implementation._construct(expected)
implementation._load(expected, test)
result = implementation._dump(test)
print(DeepDiff(expected, result))
assert not DeepDiff(expected, result)
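Both test files exercise the same round-trip pattern: dump a datablock, construct a fresh one from the dump, load the dumped state back into it, dump again, and require an empty DeepDiff. A generic sketch of that pattern, assuming `expected` comes from the implementation's `_dump` as in the lines above:

    from deepdiff import DeepDiff

    def assert_roundtrip(implementation, datablock):
        expected = implementation._dump(datablock)      # serialize the source
        rebuilt = implementation._construct(expected)   # build a fresh datablock
        implementation._load(expected, rebuilt)         # restore the dumped state
        assert not DeepDiff(expected, implementation._dump(rebuilt))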