Content - DELMIA - Ergonomics at Work Essentials
App layout
About the interface:
Object types
Definitions for PPR (Product, Process, Resources), products, and resources.
Immersive Browser
Panel used to create the building blocks of your simulations. It is part of the stack of panels on the left side of the screen.
Active Simulation Object (ASO): shows which object is the ASO and provides tools to select a different object. Any work you do in the panels is applied to the active simulation object (ASO). When you open a 3D simulation, the object you have selected immediately prior to opening it determines your initial ASO.
Probes are items that collect data (such as measurements) or perform data analysis (such as clash analysis). A dynamic clash probe is available by default; dynamic clash detection is performed between the moving parts and the rest of the entities in the viewer. The information a probe provides only appears in 3D when the probe is part of a scenario.
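As a rough conceptual illustration (not the DELMIA implementation), dynamic clash detection can be thought of as testing each moving part against every other entity at each simulation step. The Python sketch below uses invented names and simple axis-aligned bounding boxes to show the idea:

    from dataclasses import dataclass

    @dataclass
    class Box:
        """Axis-aligned bounding box: lo = (min_x, min_y, min_z), hi = (max_x, max_y, max_z)."""
        lo: tuple
        hi: tuple

    def overlaps(a: Box, b: Box) -> bool:
        """True if the two boxes intersect on every axis."""
        return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

    def dynamic_clash_probe(moving: dict, static: dict) -> list:
        """Report clashes between each moving part and every other entity in the viewer."""
        clashes = []
        entities = {**moving, **static}
        for name, box in moving.items():
            for other, other_box in entities.items():
                if other != name and overlaps(box, other_box):
                    clashes.append((name, other))
        return clashes

    # Example: the manikin's hand sweeps through the table's volume.
    hand = Box((0.4, 0.0, 0.7), (0.6, 0.2, 0.9))
    table = Box((0.0, 0.0, 0.0), (1.0, 1.0, 0.8))
    print(dynamic_clash_probe({"hand": hand}, {"table": table}))  # [('hand', 'table')]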
Behavior: allows you to create new human tasks or activities. It also stores any previously created human tasks or activities.
The simulation state will be saved under the Simulation States tab in the Immersive Browser.
Once a simulation state is created for the ASO, it is possible to restore the initial state if an object is moved. This can be helpful if an object is moved to an undesired location or orientation. To reset the simulation state:
At the same time, also consider objects that you do not want to be captured in the state. For more complex scenarios you may want to set the ASO to a low-level object in the tree (e.g., a manikin) and create a composite simulation state to add other resources or products as needed. This allows you to apply the state and restore positions with minimal impact on other objects in the workspace. If you need to capture the state of an object that does not share a common parent with the current ASO, create a composite simulation state.
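Conceptually, capturing and restoring a simulation state amounts to snapshotting each object's position and orientation and writing those values back later. The Python sketch below is a hypothetical illustration of that idea only; the names and plain-dictionary data model are invented and are not the DELMIA object model:

    def capture_state(objects: dict) -> dict:
        """Snapshot the pose of every object under the current ASO."""
        return {name: dict(pose) for name, pose in objects.items()}

    def restore_state(objects: dict, snapshot: dict) -> None:
        """Write the saved poses back, leaving objects not in the snapshot untouched."""
        for name, pose in snapshot.items():
            if name in objects:
                objects[name].update(pose)

    workspace = {
        "manikin": {"position": (0.0, 0.0, 0.0), "rotation": (0, 0, 0)},
        "toolbox": {"position": (1.5, 0.2, 0.0), "rotation": (0, 0, 90)},
    }

    initial = capture_state(workspace)                  # save the initial state
    workspace["toolbox"]["position"] = (9.9, 9.9, 0.0)  # object moved to an undesired location
    restore_state(workspace, initial)                   # reset to the saved state
    print(workspace["toolbox"]["position"])             # (1.5, 0.2, 0.0)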
With track: you need to manually trace a walk path for the manikin in the
workspace.
Human Location: You can use a Human Location as a target for the manikin to
walk to.
The GOTO activity panel will appear, and the robot will snap to the manikin's feet.
6. Start creating a walk path by dragging and rotating the robot. While creating a path, be sure that the manikin does not walk directly through objects in the workspace.
7. Control points can be added to the walk path to help adjust its trajectory. Click the walk path to add a control point (see the conceptual sketch below).
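As a hypothetical illustration of the idea (not how DELMIA stores paths), a walk path can be thought of as an ordered list of waypoints on the floor, and adding a control point inserts an intermediate waypoint that bends the trajectory around an obstacle:

    # Conceptual model of a walk path: an ordered list of 2D waypoints.
    start = (0.0, 0.0)    # where the manikin stands
    target = (4.0, 0.0)   # the GOTO target (e.g., a Human Location)

    walk_path = [start, target]  # straight line, which may cut through an obstacle

    def add_control_point(path, index, point):
        """Insert an intermediate waypoint so the trajectory bends around obstacles."""
        return path[:index] + [point] + path[index:]

    # Bend the path around a table sitting at roughly (2.0, 0.0).
    walk_path = add_control_point(walk_path, 1, (2.0, 1.0))
    print(walk_path)  # [(0.0, 0.0), (2.0, 1.0), (4.0, 0.0)]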
GOTO with human location
Gesture Activity
Macro Simulation Activities
With a GOTO and a GET activity, the manikin can go and perform a gesture to avoid or interact with an object such as a table. For more complex simulations, GOTO and GET can be included within a single macro activity as phases.
Complex macro activities are broken down into the following four phases (sketched conceptually after the list):
1. Walk To (GOTO):
Define a walk path or use a Human Location (HL), similar to the GOTO activity.
2. Preparation (GESTURE):
Sequence of key postures to prepare for the action (gesture activity).
3. Action (GET, PUT, USETOOL, REACH):
Pick up or drop an object, reach, and other such actions.
4. Return (GESTURE):
Sequence of key postures to complete the action (gesture activity).
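As a conceptual summary only, the sketch below models a macro activity as an ordered sequence of its four phases; the names and structure are invented for illustration and are not the DELMIA object model:

    # Hypothetical model: a macro activity is an ordered sequence of phase activities.
    macro_activity = [
        ("Walk To",     "GOTO",    "walk path or Human Location target"),
        ("Preparation", "GESTURE", "key postures that prepare the action"),
        ("Action",      "GET",     "pick up the object (could also be PUT, USETOOL, REACH)"),
        ("Return",      "GESTURE", "key postures that complete the action"),
    ]

    def run_macro(phases):
        """Play each phase of the macro activity in sequence."""
        for name, kind, description in phases:
            print(f"{name:12s} [{kind}] - {description}")

    run_macro(macro_activity)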
The following table shows how an object will be attached to the manikin and during which phase of the activity:
GET Activity
GET is an activity that allows the manikin's hand to pick up a product or resource.
The GET action requires the use of Human Interfaces (HI).
The object involved in the GET activity will be attached to the manikin's hand until a PUT activity occurs (illustrated conceptually below).
A GET activity can be created as a macro activity as seen in the previous lesson.
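To illustrate the attachment behavior only (this is not the DELMIA API, and all names are hypothetical), the sketch below models GET as attaching an object to the manikin's hand so that it follows hand movements until a PUT releases it:

    # Conceptual model of GET/PUT attachment: the object follows the hand until PUT.
    class Manikin:
        def __init__(self):
            self.hand_position = (0.0, 0.0, 1.0)
            self.held_object = None

        def get(self, obj: dict):
            """GET: attach the object to the hand."""
            self.held_object = obj

        def put(self, position):
            """PUT: place the object at a location and detach it from the hand."""
            if self.held_object is not None:
                self.held_object["position"] = position
                self.held_object = None

        def move_hand(self, position):
            """Any held object moves with the hand while attached."""
            self.hand_position = position
            if self.held_object is not None:
                self.held_object["position"] = position

    wrench = {"name": "wrench", "position": (0.5, 0.0, 0.8)}
    m = Manikin()
    m.get(wrench)
    m.move_hand((0.2, 0.3, 1.1))  # the wrench follows the hand
    m.put((1.0, 1.0, 0.8))        # the wrench is released at the PUT location
    print(wrench["position"])     # (1.0, 1.0, 0.8)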
During a GET activity the manikin can: