Jibe platform walkthrough

I spent some time in recent weeks with John “Pathfinder” Lester, formerly of Linden Lab and now Director of Community Development at ReactionGrid. The purpose was a walkthrough of the Jibe platform, which combines Unity3D with other technologies to deliver a more comprehensive learning experience. I hooked up with John for a tour as I’m actively looking for a platform on which to base my upcoming research.

To say the walkthrough was a revelation would be a bit of an understatement. I’d decided more than a year ago that I wouldn’t be using Second Life as the platform for my proposed simulation environment. My reasons were numerous, but it basically came down to fine detail – I’ve seen a few demos of the Unity3D engine in action, and for what I’m looking for it’s a markedly superior option, even factoring in Second Life’s potential development path over the coming 2-3 years. So I was working under the assumption of a Unity3D-only option – until I checked out Jibe.

So what is it? I’ll quote the official blurb as it summarises it pretty well:

The Jibe platform is an extensible architecture that uses a middleware abstraction layer to communicate with multiple backend systems (currently SmartFox & Photon) and frontends (currently Unity3D)

For the layperson, it means that aside from the 3D environment you’re interacting with, Jibe can bring in data from other systems. Whether it’s web content, 3D models or other graphical material, it can be integrated into the viewing experience. Here’s a graphic demonstrating it:


[Diagram: the Jibe architecture – Unity3D frontend, middleware abstraction layer, and backend systems such as SmartFox and Photon]
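To make the idea of that middleware layer a bit more concrete, here’s a minimal sketch of what coding against an abstraction over interchangeable backends like SmartFox or Photon might look like. This is my own illustration, not Jibe’s actual API – all the names and signatures below are hypothetical:

```typescript
// Hypothetical sketch only -- not Jibe's real API. It illustrates a middleware
// layer that hides whichever backend (SmartFox, Photon, ...) is actually in use.

// The common interface the 3D frontend codes against.
interface NetworkBackend {
  connect(host: string, port: number): Promise<void>;
  sendEvent(name: string, payload: Record<string, unknown>): void;
  onEvent(name: string, handler: (payload: Record<string, unknown>) => void): void;
}

// One possible concrete backend; a Photon version would implement the same interface.
class SmartFoxBackend implements NetworkBackend {
  async connect(host: string, port: number): Promise<void> {
    console.log(`Connecting to SmartFox at ${host}:${port}`);
    // ...real connection logic would go here
  }
  sendEvent(name: string, payload: Record<string, unknown>): void {
    console.log(`-> ${name}`, payload);
  }
  onEvent(name: string, handler: (payload: Record<string, unknown>) => void): void {
    // ...subscribe to server-side events here
  }
}

// The frontend only ever talks to the abstraction, so swapping backends
// never touches scene code.
async function main() {
  const backend: NetworkBackend = new SmartFoxBackend();
  await backend.connect("example.jibe.host", 9933);
  backend.sendEvent("avatarMoved", { x: 1, y: 0, z: 2 });
}

main();
```

The point is simply that the 3D frontend talks to one common interface, and the choice of backend becomes a deployment decision rather than a rewrite.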

For the educator or clinician wanting to create an immersive, realistic simulation, all of this is technical detail that doesn’t need to be known. Which leads me to what impressed me most about Jibe: the interface itself. Although I’d argue the overall browsing interface could do with some input from a designer (read: it’s not pretty), it works, and it removes a lot of the downsides that research to date has shown with using Second Life or OpenSim on its own. Let’s use an example:

[Screenshot: my avatar in Jibe standing in front of a 3D model of HIV]

What you’re seeing here is my avatar standing in front of a model of a virus – in this case the Human Immunodeficiency Virus (HIV). I can rotate the virus and check out all its aspects, and when I click on it, related information (here, a simple Wikipedia entry on HIV) opens in the same browser window I’m using. Obviously it doesn’t have to be Wikipedia – the sky’s the limit. Above the Unity window are a bunch of social media buttons that let you share the information you’re interacting with.
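For the curious, the mechanism behind this sort of thing is that embedded Unity content can call out to JavaScript on the hosting page (in the Web Player era, Unity’s Application.ExternalCall did this). Here’s a rough browser-side sketch of how a click on the virus could swap content elsewhere on the same page – my own illustration, not Jibe’s actual code, and the function and element names are made up:

```typescript
// Hypothetical browser-side sketch (not Jibe's real code): the embedded Unity
// content calls a global function on the hosting page when an object is clicked,
// and the page responds by loading related material into an iframe alongside it.

declare global {
  interface Window {
    onJibeObjectClicked?: (objectId: string) => void;
  }
}

// Map of clickable object ids to the web content they should bring up.
const relatedContent: Record<string, string> = {
  "hiv-virion": "https://en.wikipedia.org/wiki/HIV",
};

window.onJibeObjectClicked = (objectId: string) => {
  const url = relatedContent[objectId];
  const panel = document.getElementById("info-panel") as HTMLIFrameElement | null;
  if (url && panel) {
    panel.src = url; // show the article next to the 3D view, no page reload
  }
};

export {};
```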

If you refer to the Jibe graphic further above, remember that essentially anything can be bolted onto the Jibe platform. If I were to use this platform, I’d be looking at some sort of database connection that allowed common clinical pathways to be viewed step by step as an avatar completes the task. You can also choose to have key content networked or local, meaning you could ‘phase’ content for people progressing through the simulation at different paces. It’s all doable; it just takes time to set up. As far as creating content goes, any pre-fab assets from the Unity3D store can be pulled into Jibe.
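As a sketch of what I mean by step-by-step pathways with phased content, something like the following data structure would do the job. This is purely illustrative – the step names are made up and nothing here is an existing Jibe feature:

```typescript
// Hypothetical sketch of how a clinical pathway might be modelled so an avatar's
// progress drives which content is revealed ("phased") to that learner.

interface PathwayStep {
  id: string;
  description: string;
  // Content only shown once the learner reaches this step.
  phasedContentIds: string[];
}

interface LearnerProgress {
  learnerId: string;
  completedSteps: Set<string>;
}

// Example pathway: a simplified sepsis recognition sequence (illustrative only).
const sepsisPathway: PathwayStep[] = [
  { id: "assess", description: "Assess vital signs", phasedContentIds: ["obs-chart"] },
  { id: "escalate", description: "Escalate to senior clinician", phasedContentIds: ["sbar-template"] },
  { id: "treat", description: "Commence treatment bundle", phasedContentIds: ["bundle-checklist"] },
];

// Return the content a given learner should currently see: everything attached
// to completed steps, plus the first step they haven't completed yet.
function visibleContent(pathway: PathwayStep[], progress: LearnerProgress): string[] {
  const visible: string[] = [];
  for (const step of pathway) {
    visible.push(...step.phasedContentIds);
    if (!progress.completedSteps.has(step.id)) break; // stop at first incomplete step
  }
  return visible;
}

console.log(visibleContent(sepsisPathway, { learnerId: "a1", completedSteps: new Set(["assess"]) }));
// -> ["obs-chart", "sbar-template"]
```

The idea is that each learner only sees the content attached to the steps they’ve reached, so two people can move through the same simulation at quite different paces.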

If you’re interested in finding out more, this introduction to Jibe is useful. As far as pricing goes, check out the ReactionGrid Shop for options.

After spending around an hour being shown around Jibe, it really struck me how far advanced it is as a tool for educators compared to Second Life in particular. That statement is very dependent on the type of education occurring, but for more intricate work, the Unity3D interface combined with well-recognised standards for better interoperability makes Jibe a very tasty option indeed.

Comments

  1. I don’t have any fundamental objection to Jibe/Unity3D but don’t see how in your example it “removes a lot of the downsides research to date has shown about using Second Life or OpenSim on its own”. You imported a mesh, rotated it and linked to a web page. That functionality has been present in OpenSim (and, of course, SL) for a good while. There are substantial generic mesh repositories and you can make your own molecules (and maybe viruses) using UCSF Chimera and MeshLab. Good luck with Jibe anyway but please do explain your reasoning more fully in future.

  2. One fundamental difference between Jibe and SecondLife/Opensim is that Jibe runs completely in a web browser.  This allows you to create examples like the HIV one above, where Jibe is living in a browser window while events in the Jibe world are communicating with and *changing* the web page content.  I love SecondLife and Opensim for what they do very well (e.g., realtime collaborative building).  But if you’re looking for complete integration with the web, then Jibe offers something unique.  It’s all about choosing the right tool for the right job. 

    More info about Jibe and web integration: http://becunningandfulloftricks.com/2011/04/01/why-a-virtual-world-on-a-webpage-is-awesome/ and http://becunningandfulloftricks.com/2011/06/29/contextually-relevant-coolness-how-to-make-jibe-automatically-load-new-content-on-your-webpage/

  3. One small correction.  My role at ReactionGrid is now Chief Learning Officer.  I was promoted a few weeks ago. 😉 http://about.me/pathfinder
