Hello,
as ARIS supports
- model
- simulate
- use
my focus is interaction: how to interact with the ARIS system, including the generated implementations.
Currently, the common forms of interaction are
- desktop (screen, keyboard, mouse)
- tablet / smartphone (touch screen)
Future forms of interaction might be
- gestures (as provided by, e.g., Microsoft Kinect)
- speech
- or VR as a combination of the above.
At the moment, ARIS only supports desktop. Yes, I am aware of the IDS Scheer EPC viewer application, but it supports only viewing, not interaction.
VR will break the 2D limitation of screens; 3D becomes reality. This not only enables crossing-free graphs while modeling, it also means that:
- A modeler can walk through models and edit the model by gestures and speech.
- A modeler can walk through simulations and alter parameters by gestures and speech.
- A user can walk through processes and provide required information by gestures and speech.
A fully VR-enabled ARIS is my vision of "ARIS in 10 years".
Best regards
Carsten
Hi Carsten,
thanks for your idea, VR ARIS sounds great!
Can you please add a visual element to your post (as described in the contest rules) so that we can include your submission in our online voting?
Thanks in advance and good luck,
Christina
Hi Christina,
as promised, my drawings ;-)
So my rationales simply are:
* 3D makes 7±2 more feasible (https://en.wikipedia.org/wiki/The_Magical_Number_Seven,_Plus_or_Minus_Two)
* VR enables more natural interaction
* do not sit, keep moving
Best regards
Carsten
Many, many thanks, Christina,
... and now the journey of the UMeL Actor through the model, illustrated with hi-res scans. Regarding BPMN as a UML profile (http://www.omg.org/spec/BPMNProfile/) ;-)
Sorry, but I got no better than fax quality out of the scanner I had access to yesterday.
Best regards
Carsten