VR and Latency: Carmack’s Thoughts

This post originally appeared over at our sister site Metaverse Health.

John Carmack is a bit of an icon in gaming circles, and he’s also one of the people supporting the Oculus VR consumer headset that’s on the near horizon. I’d very stupidly assumed (having not read any biographical details on him until today) that he wasn’t that deep into the coding and science of things like this.

He’s just posted a nice piece of work on the challenges of latency in virtual reality. If you’re from a computer science background you’ll get a lot more out of it than I did, and even I could appreciate just how critical latency is in this sphere.

Latency is of course an important consideration anywhere, but Carmack shows just how far we probably have to go before VR headsets can give an accurate perception of real-time movement in physical space. It’ll happen, of course – and I still want an Oculus now.
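To make the stakes concrete, here’s a hedged back-of-envelope sketch (the rotation speed and latency figures are illustrative assumptions, not numbers taken from Carmack’s article): during a head turn, the rendered scene trails your actual head position by roughly rotation rate multiplied by motion-to-photon latency.

```python
# Illustrative only: how far the displayed world lags behind the head
# during a turn, as a function of motion-to-photon latency.

def angular_error_deg(rotation_deg_per_s, latency_ms):
    """Degrees the rendered scene trails the head during a turn."""
    return rotation_deg_per_s * latency_ms / 1000.0

# A brisk 120 degrees/second head turn (assumed figure):
print(angular_error_deg(120, 50))  # 6.0 degrees of lag - clearly visible
print(angular_error_deg(120, 20))  # 2.4 degrees - far harder to notice
```

Even a few degrees of lag is enough to break the illusion of being in a stable physical space, which is why shaving tens of milliseconds matters so much here.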

Daden Unveil Oopal

Oopal (pronounced oo-pull) is Daden’s latest offering: a web-based editor allowing you to place and edit objects in a 2D environment, which will then roll out to the 3D environment (currently OpenSim and Second Life with Unity3D support coming in the next 6 months). Watch this brief walkthrough video to check it out for yourself:

OOPAL Quick Introduction from DadenMedia on Vimeo.

The full press release from Daden:

Birmingham UK, 27th June 2012: Educators and trainers can now create engaging immersive learning exercises more easily and rapidly using an innovative web-based application called OOPAL, developed by learning and visualisation specialists Daden Limited.

OOPAL (Object Orientated Practice and Learning) lets educators and trainers with little technical knowledge use the web to build 3D sets from an existing library of objects, and create, edit and manage the scenarios and simulations entirely from the web. Only when they’re ready to deploy do they need to enter the 3D virtual world and “push the button” to materialise the sets and exercises ready for students to use. With OOPAL, educators – and even students – can create and maintain worthwhile learning experiences without needing to be virtual world experts.

Daden have been creating immersive learning experiences since 2008. Built on the success of their award-winning virtual learning authoring software PIVOTE, Daden’s second generation system, OOPAL, makes exercise creation and maintenance significantly simpler – making it easier to involve tutors and even students in the design and build process.

David Burden, Daden’s Managing Director says “We found that the easiest way to describe immersive learning experiences was in terms of a drama – thinking about actors and props, the script and their behaviours rather than abstract concepts like nodes and links – and we’ve designed OOPAL to reflect that – considerably easing the process from exercise design to implementation.”

A key feature of OOPAL is that it allows educators to lay out the 3D environment using a simple 2D “kitchen designer” type layout tool. Drawing from a library of props and virtual actors, educators can assign behaviours to each object – how they will react when touched, pushed, spoken to or approached. Dialogues can even be assigned to the virtual actors for use within the simulation. Users can build just a single room or even a whole environment. What’s more – once they have built their set and simulation they can create multiple copies in their virtual world – again at the touch of a button.

Fundamental to the use of OOPAL within a professional learning environment is its ability to log and time-stamp every student interaction within the exercise. This can be reviewed within OOPAL, or exported in whole or part to a VLE or LMS. OOPAL also supports scoring mechanisms for in-exercise feedback.

David says “One of the obstacles in the adoption of immersive environments for learning has been the need for educators to be experts – not in their field of study but in building within virtual worlds. OOPAL dramatically reduces that barrier and gives educators and trainers the tools to create real-world learning experiences for their learners in a 3D environment”.

OOPAL can be accessed as a cloud-hosted service from Daden, or installed on an organisation’s own servers. OOPAL currently enables exercises to be developed in both OpenSim and Second Life. Daden plan to release a version for Unity3D, and a web/iPad player, in the next six months.

So what do you think? My initial impression from watching the video is that it simplifies things to some extent, though the technical knowledge required is perhaps still a little high for some people. Personally, I’m really keen to see the Unity version and what it brings to the fray.

Euclideon pops its head above the parapet

In August last year I posted the last of a few articles on a promising new graphics technology called Unlimited Detail. As I noted there, the team were going to ground to work on getting the technology to a stage where they’d have something even more substantive to show off.

That may be a little while off yet, but xbigygames.com has an interesting piece on how Euclideon are doing. A snippet:

As mentioned when Euclideon was first revealed, this technology is something they plan to utilise not only for video games but also scientific research. Supposedly there will be “some Euclideon products released in non-games related industries over the next few months”. “There turned out to be a lot of demand for our capabilities across quite a few industries, so we have tried to put that demand in order and address each area one at a time. As soon as we have revenue coming in, we can expand our team into different departments to deal with each industry,” Dell tells us.

“I think it’s fair to say that people are starting to accept that the future of 3D graphics is atomic,” he finally points out. “Polygons will still be around a bit longer as an editing tool, but I don’t know how much longer they will remain for visualisation. So many games today have polygons that are so small that they are only a few pixels in size. When polygons become smaller than the 3 corner points that make them, there is no point in treating them like triangles anymore and it makes sense to use atoms instead.”

On the question of when we will get our next look at Euclideon-powered gaming, all Dell will say is, “Well there is soooooo much I’d love to say about that, but I’m afraid that I’m sworn to silence at this point in time. My apologies, but I think you’ll find it worth the wait.”

So things are still progressing and we should start to see some implementations of the tech before the end of the year by the sound of it.

Thanks to Phillip Street for the heads-up!

Federal Virtual Worlds moving beyond Second Life

Several years ago, the National Oceanic and Atmospheric Administration maintained more than a dozen virtual environments for online visitors to explore in Second Life. Now it operates just one.

For NOAA and other federal agencies, the focus of virtual world activity has moved beyond Second Life and diversified onto other platforms and gaming engines, according to Eric Hackathorn, a 3D Web designer for NOAA and one of the federal pioneers in virtual worlds.

“Virtual worlds are in need of some rebranding,” Hackathorn told Federal Computer Week. “Historically, virtual worlds were synonymous with Second Life, but that is no longer the case.”

While several agencies, including NOAA, NASA, the Defense Department and the National Library of Medicine, maintain a presence on Second Life, several current initiatives have shifted to open-source and in-house platforms and interagency efforts, he said. For example, DOD’s PTSD Experience invites users to learn about post-traumatic stress disorder.

“There is a lot of activity and many different use cases,” Hackathorn said, with initiatives for training, innovation and research in 3D and gaming environments.

The upcoming Federal Consortium for Virtual Worlds’ annual conference starting on May 16 will highlight some of those programs.
See on fcw.com

‘3D Virtual Campus Tours’ gains traction

I had a note from Andrew Hughes, Adjunct Instructor at the University of Cincinnati and head honcho of Designing Digitally Inc, on the success to date of their 3D Virtual Campus Tours product. Mirror worlds are of course well established and were one of the original ways universities and businesses have utilised virtual worlds.

Universities in particular are an obvious market: new students have a genuine interest in learning their way around, and virtual environments are ideally suited to helping with exactly that.

I shot some questions back to Andrew Hughes to get some more information on where 3DVCT sees itself in the marketplace and where it sees its unique value is.

Q: What was the original impetus to develop specifically for campus tours?

A: We have built over 30 campuses inside Second Life, Opensim, and other virtual worlds only to find that we’re not thinking about the convenience factor for novice users. On the web we give around 2 seconds for a website to load before we move on. We were looking to build a browser-based campus and it just so happened that the United States Air Force Academy was looking for a virtual campus tour that was online and completely a browser based replica of their campus. We won the contract and have built a browser-based high end campus tour with built in communication tools and live and AI guided tours.

Q: What sort of response have you had to date from universities, including international universities?

A: We launched the product in March of this year. With the build we have done with the United States Air Force Academy, they have had four thousand recruits through the space at this current time. We have a handful of Universities both in the USA and outside the USA we’re building now but we are under NDA’s with them and cannot disclose their name, the nature of the campus’s needs etc until they are launched on the client’s website.

Q: We have a lot of readers who are educators: can you give a little insight on the platform 3DVCT is built on, including how easy it would be to implement at a university with more restricted IT infrastructure?

A: We’re using the Unity 3D gaming engine and a complex MMO system that is connected to a dynamic server or servers. The development of the system has a complete content management system for users, history, macros for the tours, and even the ability to control the AI bot and what she says within the CMS. The databases are able to fully integrate into an existing CRM or ERM software used by the university so that there is one streamlined process.

We work hard to learn the process from day one of a potential student to the date he or she signs up for the first class. We then build the system to be as integrated and as easy as possible for the University. We also have extensive experience in building in Unity 3D, to the extent that we’ve been quoted by their CEO for our talents. The reason I state this is that we can change the ports used to adhere to the client’s specifications. We can also cloud the system so that it loads faster and is a little less processor heavy on the end user.

Q: Obviously it will vary but can you give a ballpark cost for a standard university campus tour from development to implementation? And how do you think this compares to other options in the marketplace?

A: Our company is very good at what we do and so we’re in line with any other completely custom built browser-based virtual MMO developer. We also do pricing per enrollment size. So a smaller college will get a discounted rate depending on the pricing of the current student enrollment. Right now there is not a virtual world focused on just giving virtual campus tours. Right now in the industry other virtual campus tours are 360 panoramas or Google overhead maps. An experience like that won’t let the student see how big the dorm room is compared to his or her size, nor would it allow them to actually walk around a to-scale campus to see where everything is and get familiar with the campus by actually walking around in it or talking to a live admin rep through voice and text chat we have built in.

Our virtual space is built in the high end gaming engine called Unity 3D and has had over two years of R&D built into it, so that the process can be done quickly and at a professional level you cannot get from Second Life or any other virtual world out there.

Q: What arguments would you make for your platform as opposed to say a university going it alone and developing an OpenSim grid on which to mirror their university and conduct tours?

A: We have the ability to do the following, unlike the virtual worlds you speak of above:

1. Full AI Technology
2. Control over the avatar experience
3. Custom ability to change ports
4. Higher quality of development
5. Runs in a web browser
6. Does not have a large learning curve to get into the world
7. Fully customizable both interface, experience, branding, etc.
8. Ability to be skinned and placed on your website for full ownership
9. Full content management system for the ability to control bots, users, history user tracking where they were, etc.

This is far beyond what those other platforms could ever do – I state this as we’re well known for our SL and Opensim builds, and we found that we could not recruit students effectively with them.

Q: What are Designing Digitally’s plans for the coming year?

A: We are working on 3D training simulations, and virtual worlds for government and corporate clients. Many of them are either under NDA or classified government projects. We will be launching a financial literacy system for people to learn how to manage money, buying houses, etc. This will include both Flash and Unity simulations within it. We are also going to be going to the following conferences:

– ASTD 2012
– SALT 2012

3DVCT will be at:
– Noel-Levitz 2012
– NACAC 2012

———-

So there you have it – the 3DVCT product has hitched its wagon firmly to the Unity3D platform, an obvious trend in the simulation field in particular. For what it’s worth, the time I spent checking out 3DVCT further reinforced to me the responsiveness of Unity3D. It’s not the panacea for everything but it’s dominating some key virtual world niches – which lays down a significant challenge to competitors. That can only be good for the ongoing evolution of the industry.

On the fly 3D surface reconstruction: KinectFusion

Microsoft’s Kinect is rightfully getting a lot of attention from researchers. One snippet that caught my attention is a collaboration between Microsoft and a number of UK and Canada-based researchers. The result is KinectFusion.

Have a look for yourself:

The implications for virtual worlds are fairly obvious. The thing that particularly struck me is the dynamic capability of the approach even at this early stage – if something changes with the physical world environment, it is reflected virtually. For the education, science and health fields, to name three, this is huge.

One obvious example within my pet area of clinical simulation: a camera (with consent) is placed in a busy emergency department in a large teaching hospital. Emergency nursing students based at a rural university receive that feed, converted on the fly to 3D for use within their virtual learning environment. Students could actually ‘work’ a full shift virtually, needing to respond to the challenges of the changing environment as they occur.

As I said, there’s a long way to go (for starters, KinectFusion is about surfaces only), but the progress is rapid and exciting. Over to you: what applications could you see this being good for?

Euclideon’s Unlimited Detail: a hands-on

Bruce is the better looking guy on the left

In recent days I wrote about the latest video released by Australian developers Euclideon, who are behind the ‘Unlimited Detail’ engine. In that article I claimed the video was a pretty effective rebuttal of some of the criticism / cynicism amongst the gamer community in particular.

Thanks to a convergence of schedules and geographies, I actually had the opportunity to have a hands-on with the engine myself on Friday night. CEO Bruce Dell, having just gotten off a plane from the UK, spent some time talking about his recent trip to Gamescom in Germany, the work he has on his plate and the level of interest the engine is receiving. Then it was onto some ‘play’ time. After 10 minutes or so of navigating the demo (the same one shown in the video), a few things struck me:

1. The absolute smoothness of the navigation experience

2. The fidelity of the graphical experience

3. It was all done on a bog standard PC laptop

4. If the same level of quality and smoothness continues after full animation capability is integrated, this is going to be one groundbreaking piece of technology.

5. If good consumer content creation tools are integrated with the engine, current virtual environments such as OpenSim and Second Life should be very, very concerned. Or at least be looking at licensing the technology.

I for one am excited to see what comes out the other end of Euclideon’s self-imposed media blackout over the coming months. As I said to Bruce on the way out from our meeting: he should make the best of the time out of the spotlight, because if he pulls off what he’s aiming for, it will be the last time he’ll have that luxury.

Photo courtesy of Phil Testa.

Kitely: open source virtual worlds simplified

Like a lot of virtual worlds observers, I’ve written repeatedly on the need for virtual worlds like OpenSim and Second Life to be simpler to use – ideally web browser based. Kitely, a project underway since 2008, takes a big step toward achieving that by making the establishment of an OpenSim grid nearly as simple as it gets.

It took me under ten minutes to get set up in Kitely. Here’s how:

1. Log in via Facebook Connect.

2. Install the Kitely plugin (Mac users note: Safari and Opera aren’t currently supported by the Kitely plugin; you’ll need to use Firefox or Chrome).

3. Create a world and choose if you want to invite anyone from Facebook groups you are part of:

4. Type a name, optional description and type of world you want to start with.

5. Click on ‘Enter World’ and your SL browser will launch (Mac users again – there’s a known bug whereby your username and password are both entered in the Name field of the SL browser; just type the password into its own field and delete it from the end of your name).

6. Voila – you’re now on your own island / collaborative space:

7. Three minutes later and I had my venerable log cabin rezzed on my island:

Kitely is currently in beta, and its currency is called the KC. As part of the beta you get 50 KCs to start with, and it costs 1 KC per day to keep each world you create. On the proposed maximum discount structure that works out at ten US cents per month. It’s an attractive proposition for someone not wanting the hassle of creating their own grid from scratch, and it’s more than competitive with other providers. The support functionality is fairly well set up and responsive from what I’m seeing.

There’s still plenty of kinks to iron out but Kitely is a superb snapshot of what is going to be required for wider adoption of virtual environments: simplicity and integration with other platforms. I’d be interested to hear from anyone who has experience in creating content in OpenSim as to your thoughts on comparability to other offerings. It’s also worth having a read through the Kitely FAQ, which covers a lot of stuff including the approach to intellectual property (essentially the same as Second Life) and the Terms of Service.

Thanks to a number of Metaverse Journal twitter followers for the heads-up.

Update: With thanks to reader Psx_kai, who pointed out a key fact I’d missed in the story. The pricing I described was correct but didn’t include the extra charge of US$0.20 per minute for each visitor to your world. That’s certainly going to get pricey after a while, although it seems there will be options to earn a fair whack of free KCs. On the upside, the ‘pay for what you use’ model can work well for those wanting intermittent events without the ongoing higher monthly costs of, say, Second Life.
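For anyone weighing it up, the beta pricing described above can be sketched as a rough cost model. The `monthly_cost_usd` helper and the KC-to-dollar conversion are my own assumptions, derived from the stated 1 KC/day upkeep working out to about ten US cents per month:

```python
# Rough cost model for Kitely's beta pricing as described in the post.
# Assumed figures: 1 KC per day to keep a world (~30 KC/month, stated as
# roughly ten US cents at the maximum discount) plus US$0.20 per
# visitor-minute.

def monthly_cost_usd(worlds=1, visitor_minutes=0, usd_per_kc=0.10 / 30):
    """Estimate monthly spend in US dollars.

    worlds          -- number of worlds kept for the month
    visitor_minutes -- total visitor-minutes across all worlds
    usd_per_kc      -- assumed dollar value of one KC
    """
    upkeep_kc = worlds * 30              # 1 KC per world per day, ~30 days
    upkeep_usd = upkeep_kc * usd_per_kc
    visitor_usd = visitor_minutes * 0.20
    return round(upkeep_usd + visitor_usd, 2)

# One quiet world: about ten cents a month.
print(monthly_cost_usd(worlds=1))
# One world hosting a single two-hour event for 20 visitors:
print(monthly_cost_usd(worlds=1, visitor_minutes=20 * 120))
```

On these assumed numbers an empty world is practically free, while one two-hour, 20-person event costs around US$480 – which makes it clear why the per-visitor-minute charge, not the upkeep, dominates the bill.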

The Virtually Live Events Project

This is a guest post from Surreal Numbers on how the Openlife grid has played host to numerous musical events. It’s one of many examples of how OpenSim and related grids are continuing to grow in popularity and maturity.

Thanks to Shai Khalifa for the heads-up on the project initially, and for a historical take on Openlife you can also view our original 2008 profile of the Openlife grid here.

==========

I’d like to thank Lowell Cremorne and The Metaverse Journal for taking an interest in the Virtually Live Events Project and publishing this article.

Purpose and Initiation

User-created virtual reality is the most flexible and powerful tool for sharing information and imagination. Upon entering a space made with 3D modelling primitives and scripts, a visitor can take a journey far richer than that offered by other sharing technologies including blogs, photographs, and audio-video streams. At the same time, community development using virtual reality can have a significantly different life-cycle as compared to the use of other social media.

Social development and technical capabilities are closely entwined in virtual worlds. For example, without a functioning script engine, reliable login servers, and robust sim physics, it is not reasonable to plan and host events. By late 2009, the Openlife Grid had reached levels of stability, scalability, and security appropriate for hosting events reliably. The Virtually Live Events Project (VLE) was started on 05 September 2009 with a solicitation for an “International Live Music Events Developer” committed to consistency, innovation, excellence, and sustainability. Goals were set for number of monthly events, expected audience size, an international distribution of performers representing all continents, and integration with the grid’s business community. The quantity and quality of the responses were overwhelming. They were invariably professional and, most strikingly, reflected a strong spirit of generosity.

The solicitation was focused on finding a single person capable of initiating and maintaining the project so I had not anticipated that the majority of feedback suggested that I form a project team and manage it. As I came to learn, a team was easily justified by the extensive list of tasks that needed to be addressed. But I was reluctant to manage. I’m a mathematical scientist, not a musician, and I felt unqualified to understand music event hosting much less how to build a sustainable arts program.

Eventually, I recognized two things. First, I have a strong interest in hearing music from everywhere. My father had been a Grammy-nominated recording engineer for RCA Records and worked with outstanding musicians and singers from around the world. In addition, I’m a product of the South Bronx, which has a rich fusion of multinational music that simply will not allow one’s body and mind to sit still. Second, professionally, I have a lot of experience planning and hosting conferences as well as managing international research project teams. It seems to be an odd combination of characteristics on which to base the decision to manage but now, a year and a half into VLE, they seem even more applicable.

Challenges and Team

The challenges for the project are to:
1. host music events consistently;
2. innovate to keep performances and venues fresh;
3. work towards a standard of excellence; and
4. sustain and grow the project into the future.

To meet these challenges the team, which has evolved over eighteen months, was initially Debbie Trilling, Adec Alexandria, Shai Khalifa, Digital Dreambuilder (Digi), Pantaiputih Korobase (Pants), and me. At present, it includes Shai, Digi, Cheops Forlife, and me as well as a consulting group with Caro Axelbrad, Grimley Graves, and Pants.

Debbie Trilling and Adec Alexandria (both UK) helped establish a strong footing for the project as well as provided me with the best possible mentoring for managing it. Debbie’s artistic work is well-known in Second Life. She sets a very high standard for quality and was always quick to point out what would not work, what would work, and why. Adec has experience hosting events, is a keen photographer with a great design sense, and an excellent builder who can quickly bring prim form to the vision in his mind’s eye. Debbie and Adec eventually resigned because of other personal and professional obligations but their influences still underpin Virtually Live Events.

Shai Khalifa (Australia) has a degree in arts management, was a professional musician, and has extensive experience managing virtual music events. She has been invaluable for vetting, contacting, and booking performers. Her role is especially challenging since she is literally the artist’s first contact with VLE and she has the professionalism to address whatever questions, comments, or observations arise. In addition, her experiences have provided sound insight into how the performance program should be structured and how it will evolve.

Digital Dreambuilder (Native of Ireland living in Finland) is innovative, a skilled builder and scripter, and an amateur musician with experience planning and hosting virtual events. He’s also professionally involved with virtual education and training, which has implications for the future of VLE. He has a well-grounded sense of setting goals, the capabilities for meeting them, insights for avoiding pitfalls, and the creativity for crafting fallback plans in the case of disaster. He’s built and animated almost all of the exceptionally detailed musical instruments used by performers on Virtually Live.

Pantaiputih Korobase (Germany) was an early member of the project team selected for his insightful nature, exceptionally big heart, people skills, and diamond in the rough building skills (nowadays, he’s well-cut and polished). His role has been recast as a consultant in order to accommodate his personal wishes.

Cheops Forlife (France) was added to the team after Debbie and Adec left. She is unfailingly cheerful, positive, and creative. Once new performers are booked, she brings them inworld and prepares their avatars for the performances. This is no small task since there are psychological, sociological, and technical factors involved. However, she is exceptionally well-suited to the effort given her training in psychology and professional background managing non-profit programs.

Caro Axelbrad (Spain) and Grimley Graves (US) serve as consultants to VLE. Caro custom builds skins, shapes, hair, and clothing for the performers when needed. Both she and Grims have been longtime supporters of the project and, along with Pants, share their creative ideas for helping VLE grow and evolve over time.

I help the team meet the project challenges. I especially enjoy designing and building our default and themed venues.

Although not a member of the project team or consulting group, Logger Sewell deserves recognition for donating the stream used by Virtually Live. His action was an early example of the generosity the project enjoys.

Performances and Venues

VLE performances and venues have been well-documented on the project blog as well as Twitter where performers are announced and event photos are posted. Over the last eighteen months, VLE has held themed events (seasonal parties, wear your green dots, pool, beach, and valentine’s aftermath, among others) and rebuilt the project region, Virtually Live, many times to accommodate the themes as well as new concepts for the entire venue.

Performers have responded enthusiastically. It is really important to the VLE team that the performers have the best possible experience whenever they visit Openlife and this is reflected in their feedback both to the team and the event guests. Time and again, performers have commented on how much they enjoyed the entire process of coming inworld and performing. While musicians and singers had previously crossed from one grid to another to perform, the VLE project broke new ground by establishing an innovative mentoring model to make their transition to Openlife simple and fun. One broad reaching effect of the VLE model is that it has provided a methodology that performers use to explore the potential of other virtual worlds, which increases their reach and audience base.

The performers, moreover, invariably notice both the unusual artistic venues on Virtually Live and the chatty appreciative crowds that attend. The Virtually Live region is devoted exclusively to the arts and the builds are among the most distinctive and beautiful performance venues in any virtual world. In turn, performers all want to come back and have spread the word to other performers throughout the metaverse, who have either already performed on Virtually Live or will be booked in the future.

The most important thing the team wanted for the audience was simply a relaxing fun time that everyone could count on happening regularly. Again, the response has been overwhelmingly positive and the best part of this has been the social development. Friending occurs frequently during each event and connections are made or strengthened. New residents of the grid are treated to a warm and helpful greeting in an enjoyable atmosphere, which reinforces the reputation of the community.

In addition to the many blog photos from events, Caro and Pants have each made videos of some VLE performances; a few links are:

Openlife 3rd Birthday and Halloween Party by Caro

Idella Quandry “Fields of Gold” by Caro

Yellow Pool Party by Pants

The Future

Recent updates to Openlife’s infrastructure have brought further improvements in scalability and features, which, in turn, can be used to enhance and expand the social environment of the grid. For example, for several events, VLE has made use of scene-flip, a feature unique to Openlife that, with the touch of a button, flips the entire region to an alternate scene. Having a full-sized blank region with 45,000 prims to use for a themed event has led to some beautiful scenes that took a lot of people to build. The Openlife 3rd Birthday and Halloween Party video illustrates the use of scene-flip. While the default venue remained secure in Slot 1 of Virtually Live, an entirely different scene was built for the party in Slot 2, including themed terraform and builds.

Virtually Live Events, notably, has been a valuable source of performance and stability data for Sakai Openlife, 3DX Openlife founder and owner. These data have been helpful for identifying technical problems and solutions that lead to grid improvements enjoyed by all residents.

While the emphasis to date has been on music, VLE intends to increase its scope to include theatre, art exhibitions, and arts education venues. Virtually Live Events wants to thank all performers and guests for their support over the last eighteen months and looks forward to where technical capabilities and social development take the project in the future.

Blue Mars on iPhone: testing underway

A brief post from Australian fashionista and Blue Mars insider Estelle Parnall has a shot of some testing she’s done of the iPhone version.

The shots (pictured left) aren’t really different from the ones released when the iPhone version, PC version halt and staff layoffs were announced a fortnight ago.

That said, those limited glimpses do show promise; it’ll be interesting to see the progress made in coming months.

Disclosure: Estelle advertises with the Metaverse Journal.
