Immersive Analytics – Future Interactions of Hybrid and Augmented Reality


Immersive Analytics (IA) has advanced rapidly, and a great deal of research is directed toward it because of the vast opportunities the technology offers in analytical reasoning, data exploration, and decision making. IA simplifies the known data of a system into a more visual, elaborated, tangible form, making it more interactive and engaging and establishing a realistic feeling of an alternate reality.

Immersive Analytics is a new multi-disciplinary, creative exploration into the future of interaction technologies, converging information visualization, visual analytics, virtual and augmented reality, and natural user interfaces. When zeroing in on enterprise software, businesses place particular stress on usability.

An end user looks for access to accurate data, in the appropriate format, through the most conducive channel, within the right context – especially for Business Intelligence (BI) and analytics applications. Designers need to break the glass ceiling of usability, move past the conventional keyboard and mouse, and experiment with innovative modes of interaction that are more innate and natural, such as visual, vocal, and hand movements.

There is an imbalance between human analytical capabilities and the volume, variety, and pace of demand. This calls for delegating more tasks to computing machines – transforming, representing, and condensing information into assimilated, actionable bytes – with a particular focus on the scalability of IA.

Organic Interactions

The recent boom of Alexa and Echo makes it clear that voice is an inherent mode of communication for human operators. The big question, then, is whether it is possible to simply ask the BI suite to pull up a report through speech alone. For building the GenY IA interface for analytical tools, the top priority is a first layer of input that supports multiple Natural Language (NL) modalities – voice, visual, or hand gesture – or a mix of virtual reality (VR) and mixed reality (MR).

A user is likely to use more than one IA modality at any given time, which makes interactions more natural by emulating the characteristics of human-to-human communication – more intuitive, easier, and user friendly. The multi-modal interaction layer then orchestrates a well-defined programmed command after interpreting the end user's IA inputs.

Once the user's inputs are captured, the first layer analyzes the recorded information and preserves it so it can be readily construed by the next layer, where the inputs are decoded into executable commands and dispatched as tasks to the analytical applications.
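The two-layer flow described above can be sketched as a small pipeline. This is a minimal illustration, not a real product API: the class names, modality labels, and command strings are all assumptions made for the example.

```python
# Illustrative sketch of the two-layer pipeline: layer 1 captures and
# filters raw inputs from several modalities; layer 2 decodes them into
# executable commands for the analytics application.
# All names (RawInput, command strings, gesture labels) are hypothetical.
from dataclasses import dataclass


@dataclass
class RawInput:
    modality: str   # e.g. "voice", "gesture", "gaze"
    payload: str    # transcribed speech, gesture label, etc.


def capture_layer(events):
    """Layer 1: keep only modalities the system knows how to interpret."""
    supported = {"voice", "gesture", "gaze"}
    return [e for e in events if e.modality in supported]


def command_layer(inputs):
    """Layer 2: decode normalized inputs into executable commands."""
    commands = []
    for inp in inputs:
        if inp.modality == "voice" and "report" in inp.payload.lower():
            commands.append(("RUN_REPORT", inp.payload))
        elif inp.modality == "gesture" and inp.payload == "swipe_left":
            commands.append(("NEXT_DASHBOARD", None))
        else:
            commands.append(("IGNORE", inp.payload))
    return commands


events = [RawInput("voice", "Pull up the quarterly sales report"),
          RawInput("gesture", "swipe_left")]
print(command_layer(capture_layer(events)))
```

In a real system the decoding step would involve speech recognition and intent classification; here a keyword match stands in for that machinery.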

Intelligent Interface

For the user experience (UX) to be more engaging, the user interface (UI) must be agile enough to strike a gripping conversation. Primarily, the UI must be capable of seamlessly switching from one major natural-language modality to another. The IA task-execution framework is crucial for the overall UX to be seamless. For an NL interface, execution tasks are dialogue-based: a two-way information exchange where output is generated from the information received, oriented toward retrieval of canned responses and reports, and generating insights in response to precise queries.
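A minimal sketch of dialogue-based task execution might look like the following. The query strings and canned responses are illustrative assumptions; the point is the two-way shape of the exchange: a precise query retrieves a canned result, while an ambiguous one prompts a clarifying question.

```python
# Minimal dialogue loop: precise queries retrieve canned responses or
# reports; imprecise queries trigger a clarifying follow-up question.
# All query keys and response strings are hypothetical examples.
CANNED_RESPONSES = {
    "revenue this quarter": "Q3 revenue: see the attached summary report.",
    "top customers": "Top customers ranked by lifetime value.",
}


def dialogue_turn(query: str) -> str:
    """One turn of the dialogue: match a precise query, else ask back."""
    key = query.strip().lower()
    if key in CANNED_RESPONSES:
        return CANNED_RESPONSES[key]
    # Two-way exchange: rather than guessing, the system asks the user
    # to disambiguate, keeping the conversation going.
    return f"Did you mean one of: {', '.join(sorted(CANNED_RESPONSES))}?"


print(dialogue_turn("Revenue this quarter"))
print(dialogue_turn("profits"))
```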

For gesture-based interactions, IR sensors and cameras capture the inputs and decode them into conventional UI actions, the same as cursor clicks and moves. For a more immersive analytics report, the output can be rendered in VR/MR, in which case the end user can interact with the visual data for a virtual UX.
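The gesture-to-cursor translation above amounts to a mapping from decoded gesture events to the pointer actions they emulate. A sketch, with entirely hypothetical gesture names and action labels:

```python
# Hypothetical mapping from sensor-decoded gestures to the conventional
# pointer actions they emulate, as described in the paragraph above.
GESTURE_TO_ACTION = {
    "pinch":      "click",
    "pinch_hold": "drag",
    "palm_swipe": "scroll",
    "point_move": "move_cursor",
}


def decode_gesture(gesture: str, position: tuple) -> dict:
    """Translate one decoded gesture into a UI action event."""
    action = GESTURE_TO_ACTION.get(gesture, "noop")
    return {"action": action, "x": position[0], "y": position[1]}


print(decode_gesture("pinch", (320, 240)))  # behaves like a mouse click
```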

Keeping in mind the predominant trend of end-user expectations, the GenY IA interface will in all probability come with built-in platform intelligence. Using machine learning algorithms, both supervised and unsupervised, the system can enhance its responses based on queries, search strings, and sentiment analysis – constantly adding to its capabilities.
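The idea of a system improving from usage can be illustrated with a naive feedback loop: track how often each query is issued, attach a crude sentiment score from user feedback, and promote frequently used, positively rated responses. The word lists and scoring scheme are assumptions for the sketch, far simpler than real supervised or unsupervised learning.

```python
# Naive "platform intelligence" feedback loop: query frequency plus a
# crude lexicon-based sentiment score re-rank future suggestions.
# The lexicons and ranking rule are illustrative assumptions.
from collections import Counter

POSITIVE = {"great", "helpful", "good"}
NEGATIVE = {"wrong", "bad", "useless"}

query_counts = Counter()
sentiment = Counter()


def record_interaction(query: str, feedback: str) -> None:
    """Log one query and score the free-text feedback it received."""
    query_counts[query] += 1
    words = set(feedback.lower().split())
    sentiment[query] += len(words & POSITIVE) - len(words & NEGATIVE)


def ranked_suggestions():
    """Rank known queries by sentiment, breaking ties by usage count."""
    return sorted(query_counts,
                  key=lambda q: (sentiment[q], query_counts[q]),
                  reverse=True)


record_interaction("sales by region", "great chart, helpful")
record_interaction("sales by region", "good")
record_interaction("churn forecast", "wrong numbers")
print(ranked_suggestions())
```

A production system would replace the lexicons with a trained sentiment model, but the feedback-driven ranking loop is the same shape.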

Conclusion

The past decade has witnessed swift advancement and wide adoption of immersive devices like the HoloLens and Oculus Rift, which can expressively extend visual analytics by providing critical 3D context for data and developing a sense of virtual presence. The immersive virtual world has no boundary, while time and scale are pliable. VR is undoubtedly a viable and vital technology when implemented accurately to understand the remote and visual importance of each component of an asset. However, if the product is detached from its entire lifecycle, IA leaves engineers and staff in a fix.

Data encryption standards must be bound by defined storage, dissemination, and retrieval policies. On top of it all, these standards must be backed by a watertight governance framework with provision for periodic audits and strict regulations to maintain records of usage. This discussion is only a sneak peek into the future of the IA interface for composite business tools.


 

