I am running very short on time as I will be leaving the country for a few days soon. Because of this, I took it upon myself to look into research for the second minigame. This minigame will be about the use of Infrared in space exploration. This is not a complete literature review, as my role is to focus on the design/implementation of the minigames, but it should be enough for me to make a start on the design and implementation of my second Infrared-based minigame.
From what I've gathered on the Internet and by talking with Dr. Claus in one of our meetings, Infrared is used to explore and see things in space that would otherwise not be viewable, or not as easily viewable, due to interference. Infrared is a form of electromagnetic radiation. An example of this can be seen in the image here.
I started by looking at the research links Begoña posted regarding Beagle 2, after Dr. Claus mentioned to us that it might be worthwhile to look into it. I looked through a few of her links here (http://astronauticssimulation.blogspot.co.uk/2014/12/more-about-dr-malcolm-claus-meeting.html).
I stumbled across the Infrared Mineralogical Mapping Spectrometer (OMEGA) experiment documentation after viewing the NASA Beagle 2 experiments page. I spent an hour reading the documentation and around the topic of spectrometers, but this information was very technical and way over my head. This specific entry was also just a proposal, and no actual data was collected. I also realized I needed something simpler and more eye-catching for my minigame. But the mention of spectrometers gave me a starting point.
In our final team meeting we proposed looking into an "explore the galaxy" option, as it may break up the constant 'staring at planets and moons' theme. Rather than focusing on one single object in space, this minigame would be about a few different ones viewed from afar. I looked a bit deeper into this with spectroscopy, using YouTube, and I found this video:
https://www.youtube.com/watch?v=faW_G3ctB8A - Exploring the Infrared Universe. This was a very useful video clip and it introduced me to Infrared exploration in a way that I understood. In the video they mention, amongst other things, the near, mid and far infrared wavelengths and what each is used for.
The video goes on to mention the different space telescopes: IRAS, ISO, Akari and Spitzer. This information was just what I needed. I summarized the juicy bits below:
Near / mid / far infrared
As an object cools it will emit its radiation at progressively longer wavelengths, and therefore further into the infrared.
Near infrared = radiation wavelengths that are longer than those in the visible spectrum (what we can see normally). Cooler red stars become more apparent and interstellar dust becomes transparent when viewed at near infrared wavelengths.
Mid infrared = the cool interstellar dust itself starts to shine. Interstellar dust can often be found around celestial objects such as red stars.
Far infrared = emitted by very cold objects. Using these wavelengths astronomers can observe the cold radiation of protostars and stare into the centers of galaxies, including the Milky Way. This allows us to make observations of objects very, very far away, as we can bypass the plethora of noisy data that may otherwise appear.
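The "cooler objects emit at longer wavelengths" point above is Wien's displacement law: the peak emission wavelength of a blackbody is λ_max = b / T, with b ≈ 2.898×10⁻³ m·K. A quick sketch of how this maps the objects mentioned above onto the infrared bands (note: the band boundaries below are rough conventions I've assumed, not official definitions):

```python
# Wien's displacement law: peak emission wavelength shifts longer as objects cool.
WIEN_B = 2.898e-3  # Wien's displacement constant, in metre-kelvin

def peak_wavelength_um(temp_kelvin):
    """Peak emission wavelength (micrometres) of a blackbody at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e6

def infrared_band(wavelength_um):
    """Classify a wavelength into a rough infrared band (boundaries vary by source)."""
    if 0.7 <= wavelength_um < 5:
        return "near infrared"
    if 5 <= wavelength_um < 25:
        return "mid infrared"
    if 25 <= wavelength_um <= 350:
        return "far infrared"
    return "outside the infrared"

# Example temperatures: a cool red star, warmish interstellar dust, and the
# ~13 K dust rings ISO found in Andromeda (mentioned later in this post).
for label, temp in [("cool red star", 3000),
                    ("warm interstellar dust", 200),
                    ("Andromeda's cold dust rings", 13)]:
    wl = peak_wavelength_um(temp)
    print(f"{label} at {temp} K peaks near {wl:.0f} um -> {infrared_band(wl)}")
```

So a 3000 K red star peaks just into the near infrared, while 13 K dust peaks deep in the far infrared, which is why each band reveals different objects.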
Observing the Infrared spectrum from Earth is difficult because molecules in the atmosphere interfere with the observation. Additionally, the Earth's own infrared radiation interferes with observation. It is due to this that the best place to measure the infrared spectrum from is space.
Spacecraft & Instruments
The video went on to briefly explain the different spacecraft used in Infrared imaging. Again, a summary of what was said can be found below.
IRAS - The first ever space-based observatory for Infrared wavelength measurement.
ISO - Detected water in the universe, in the atmospheres around planets in our galaxy.
- The dust and gas that fill the space between stars are called the interstellar medium. ISO found a carbon-rich material called polycyclic aromatic hydrocarbon. In space this material's presence is a strong indicator of organic chemistry and can be used in research into life on other planets.
- Discovered that there was a peak of star formation about 3 billion years ago. This discovery was possible because ISO was able to use the infrared spectrum to see past the interference that normally surrounds galaxies.
- Andromeda (our neighboring galaxy) is considered to be a typical spiral galaxy. However, ISO discovered that it is made up of several concentric rings of very cold dust, around 13 Kelvin. Far too cold to be viewable at visible wavelengths.
- In a nearby galaxy, fast-moving streams of plasma were observed being released from the center of the galaxy, but until the introduction of ISO we were unable to see through all the gas and dust into the center. Using Infrared wavelengths, ISO revealed that the central object in this galaxy was a black hole.
Akari - Japanese infrared astronomy satellite; not much information on this.
Spitzer - NASA's infrared space observatory, launched in 2003.
Herschel Space Observatory
After those, the ESA's Herschel space observatory was built: the world's largest space telescope. It allows unparalleled exploration capabilities and lets us probe space in much more detail than before, using the Infrared spectrum. Herschel consists of the following parts, which make up its payload.
PACS - Photodetector Array Camera and Spectrometer - Can study young galaxies and star-forming nebulae. It is the first spectrometer capable of obtaining the complete image of an object at once.
SPIRE - Spectral and Photometric Imaging Receiver - Designed to exploit wavelengths that have never been studied before. Can be used to study the history of star formation in the universe.
HIFI - Heterodyne Instrument for the Far Infrared - A high-resolution spectrometer, also designed to observe unexploited wavelengths. It is able to identify individual molecular species. Used to study galaxy development and star formation.
The remaining parts of the payload consist of the shielding and cooling systems. All of these are found underneath the huge primary mirror (the largest of its kind in space).
After watching the aforementioned video I started looking at Herschel itself and found the following YouTube videos very interesting:
The lecture videos were very interesting and gave some very good photographic examples of star formation using Herschel, allowing viewers to see the otherwise unviewable 'extragalactic background'. This was achieved using the SPIRE camera mentioned earlier.
They also mentioned that the HIFI system was used to obtain 'the most complete spectrum of molecular gas at high spectral resolution ever'.
Again, this is where graphs, unfamiliar calculations and terminology started to come in, but it was still useful to get an overview of just how the Herschel space observatory, and more importantly Infrared wavelength measurements, were being used in a practical way.
They also mentioned the Herschel ATLAS program, the largest-area sky survey undertaken with Herschel. This program might be worth looking into if I need some new ideas for minigames. At this moment in time, however, the team has agreed that I will be doing two minigames only and AK the other two.
Rough Concepts based off research
Thus far I have come up with two core concepts for my infrared minigame based on the research above:
Star Formation using Infrared - This could be something simple like the connect-the-dots constellation game in Dragon Age, or like the skill constellation mechanic in Skyrim.
Different Wavelengths - Using different types of wavelength to see and identify different things in a nebula (Near/Mid/Far Infrared Wavelengths).
Either of these will work. I think I have enough research about infrared for now. My next objective will be to look into how to make a nice-looking nebula skybox to use. Regardless of which option I pick from the above, I will need to have this made, and it will need to look as amazing as possible. I might even need three versions (one for each potential wavelength).
I'm taking a moment away from my GUI shenanigans to write a small update.
In short, I might have jumped the gun with regards to my rollback. A few weeks ago I spent some time looking at how to use the new Unity 4.6 GUI with the Oculus Rift. I kept coming back to this topic here.
Without the hardware to test this out, I grew increasingly concerned that I would develop the simulation on a version of the engine that was not fully supported yet. From what I gathered, some people were having trouble seeing their application's output on the actual Rift device.
What I should have done was investigate this issue in more depth but as I was so distracted with the development (and a bit of team management) I didn't do this.
From the very beginning of the project I have had to keep an eye on monthly (even weekly) updates from Oculus and Unity, as they are very active in the development of their products.
For example, when our module started, Oculus support for Unity Free didn't exist. A month later it was up and running and being used by many people, and updates have been whizzing by.
It was with this mentality in mind that I thought I would roll back to build 4.5.5. My thinking was that once 4.6 was fully ready, I would be able to update my project to 4.6 pretty quickly and just create a new UI.
However, if I created the simulation on 4.6 and it turned out not to work properly on the Oculus Rift, I would not be able to quickly roll back to 4.5. In fact it took me a full day to get my 4.6 content running on 4.5, as I had to manually import all assets and re-add them to scenes.
Without the hardware I was running on assumptions. That is, until I read deeper into things. From what I can tell, the user in the above post was using the Direct to HMD (Direct Mode) option on the Oculus Rift rather than Extended Mode. This is still an issue with the DK2 when developing, and is a minor annoyance rather than a deal breaker.
On the 23rd of December the following post was made on the Unity blog, stating: "The Unity Free integration for Oculus gives you access to the exact same Oculus features as users of Unity Pro. You can use Unity 4.6 and the Oculus integration package to deploy any sort of VR content imaginable to the Rift!"
So what does this mean? Basically, I am now rolling all the work I did today in 4.5 back into 4.6, and then I will be merging it with the work I had already done in 4.6. A bit of time wasted, but it's my own fault for not spending the time reading up on things more.
At least now I can get a UI up and running on the Rift. See my previous post titled "Minigame work cont. - Looking into new Unity 4.6 UI with the Oculus Rift" for more on this.
While refreshing myself on system design, I stumbled across an interesting site with a bunch of research articles. They appear to cover many different topics, including Games Development, Mobile Development, Programming, etc.
I spent some time last night and the day before porting the first minigame out to build a demo for Dr. Claus. I ran into a problem when creating a UI for my game. Without the Oculus Rift, creating a UI was trivial.
A standard canvas worked fine, and elements resized themselves nicely along the anchor points introduced in Unity 4.6. However, when using the Oculus the UI would not show up at all.
I have been googling around as time permits to see how others have gotten around this. One solution was to roll back to Unity 4.5.5#. This would solve the problem, but means no fancy new UI features and probably a lot of annoying UI code.
The other solution, which I found on YouTube, was to use a Canvas but set it to World Space. I tried this out and it works a treat.
However, I saw a post on Unity Answers mentioning that, according to the Oculus forums, Unity 4.6 is not yet supported, and that people were having trouble seeing anything on the actual Rift hardware when using Unity 4.6. Additionally, there have been reports of input issues when using 4.6 with the Oculus.
I posted a request on the Team Blog asking any team members to try and snag us a Rift so that I can run some tests. I am unfortunately unable to get a Rift out myself, as every time I have tried, I am at work whenever a Rift is available. And I have been trying to organise this process via e-mail, which is very difficult even with the Tech staff being very supportive.
If this is the case, then we would have no choice but to roll back to the supported 4.5.5# build. This means more implementation time for UI elements. I'll keep working on this today, but my current focus is getting a demo to Dr. Claus.
Need to rethink this; when playing on the Rift this might not be a good feature.
Taken from the Oculus Best Practices documentation, pages 2-3: 'Avoid visuals that upset the user's sense of stability in their environment. Rotating or moving the horizon line or other large components of the user's environment in conflict with the user's real-world self-motion (or lack thereof) can be discomforting.'
The display should respond to the user’s movements at all times, without exception. Even in menus, when the game is paused, or during cutscenes, users should be able to look around.
Use the SDK’s position tracking and head model to ensure the virtual cameras rotate and move in a manner consistent with head and body movements; discrepancies are discomforting.
Create immersive UI
Work in progress
Keep in mind the following from the Oculus Best Practices documentation: 'Maintain VR immersion from start to finish – don't affix an image in front of the user (such as a full-field splash screen that does not respond to head movements), as this can be disorienting.'
Add timer until victory - Added, need to hook up to code
Still the case
Add defeat once rotation meets X position
Replace placeholder assets
Integrate and test on Oculus Rift
Got this up and working in 4.6. Can I make this work in 4.5.5#? Need to do more research.
Left on IR minigame:
I also started looking into Infrared for use in space exploration, using the links Begoña posted on our group blog. I have a few ideas knocking around that I hope to work on. I need to find a fun way to implement this idea, rather than having a simple "press B to scan" game.