In the first of a two-part article, we will discover how IBM technologies are helping us to explore higher, wider and deeper than ever before, pushing technology to the limits of our very existence.
At the IBM Innovation Centre in Hursley, we have the skills, knowledge, expertise and resources to help bring your innovations to life. All you need is an awesome idea and we can help you do the rest. So when eight students from Imperial College London got in touch to say they wanted to visit space, how could we refuse?
Code-named ‘Project Edge’, the team’s mission was to design a proof-of-concept in just four weeks that could help prepare first responders for hazardous conditions by providing them with an intelligent environment where they could safely acclimatise to the surroundings they would be exposed to. They decided on an environmental probe that could measure temperature, light and movement, and then beam that data back to Hursley to recreate the emergency conditions in real time within the IBM Innovation Centre’s ‘Internet of Things Lab’. To test our proof-of-concept, we sent our probe to one of the harshest environments anywhere on, or above, the Earth: space.
Leveraging the IBM Innovation Centre
By visiting us at the IBM Innovation Centre in Hursley, the team from Imperial College was able to collaborate with inventors and subject matter experts to design and develop the final prototype.
Before we began work on the probe, we discussed the methodologies we would use to help design, architect and implement the project. One of the more thought-provoking elements of the project was the use of ‘microservices’. Microservices break a larger project up into smaller, discrete services, allowing programmers to fold the possibility of failure into their design so that the system can continue, even if a particular service should fail.
With such a short time frame, the team was drawn to the capabilities of Bluemix to develop the code that ran the probe. Using a cloud-based development platform meant that the team did not have to concern themselves with installation or configuration of software; they could begin working immediately. The value of microservices became apparent when we provided a live demonstration of Bluemix and Node-RED, and the team was able to assemble a rudimentary, microservices-based model in just a few minutes. The final part of the tour involved exploring a range of sensors that the team could incorporate into the probe to feed data back to Hursley, which could then be processed by Bluemix in the Command Centre back on Earth.
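The failure-tolerance idea behind microservices can be sketched in a few lines. This is a minimal illustration, not the team's actual code: each sensor reading is handled by its own small, independent service, and a failure in one is recorded rather than allowed to bring the whole system down. The service names and readings here are invented for the example.

```python
def read_temperature():
    # A healthy service returning its reading
    return {"temperature_c": 21.5}

def read_pressure():
    # Simulate a failed service, e.g. a sensor that has gone offline
    raise RuntimeError("pressure sensor offline")

def read_luminosity():
    return {"lux": 480}

def collect(services):
    """Run every service, folding failures into the result instead of crashing."""
    readings = {}
    for name, service in services.items():
        try:
            readings.update(service())
        except Exception as err:
            readings[name] = f"unavailable ({err})"  # note the failure, carry on
    return readings

if __name__ == "__main__":
    services = {
        "temperature": read_temperature,
        "pressure": read_pressure,
        "luminosity": read_luminosity,
    }
    print(collect(services))
```

In a Node-RED flow the same principle applies: each node handles one small job, and a catch node can route a failing branch aside while the rest of the flow keeps running.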
Using cognitive insights
By incorporating IBM Watson’s image recognition technology and Bluemix cloud capabilities with low-cost, durable electronics, including a Raspberry Pi, the team managed to build a ground-breaking cognitive probe. The probe was designed to routinely collect atmospheric readings, including:
- Altitude (in metres)
- Pressure (in pascals)
- Temperature (in degrees Celsius and Fahrenheit)
- Acceleration and rotation
- Luminosity (in lux)
- Magnetic flux
In addition to these readings, the probe can also record its GPS location, as well as collecting images at various points along the flight path using Raspberry Pi camera modules. All this data can then be sent to the Command Centre to be analysed and interpreted. In a real-world scenario, as well as providing data to the intelligent environment, this information could also be passed on to warn the responsible authorities and first responders in cases of emergency, such as forest fires or flooding.
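To give a feel for what the probe might transmit, here is an illustrative sketch of bundling the readings listed above, a GPS fix, and an image reference into a single telemetry message. The field names, values and schema are assumptions for the example, not the team's actual data format.

```python
import json
import time

def build_telemetry(readings, gps_fix, image_ref=None):
    """Bundle sensor readings, location, and an optional image reference
    into one JSON message for the Command Centre."""
    message = {
        "timestamp": time.time(),
        "location": {"lat": gps_fix[0], "lon": gps_fix[1]},
        "readings": readings,
    }
    if image_ref is not None:
        message["image"] = image_ref  # e.g. a filename from the Pi camera
    return json.dumps(message)

# Example values, chosen purely for illustration
packet = build_telemetry(
    {"altitude_m": 18250, "pressure_pa": 7000, "temperature_c": -51.2,
     "lux": 120000, "magnetic_flux_ut": 32.1},
    gps_fix=(51.025, -1.398),          # roughly Hursley, for illustration
    image_ref="capture_0042.jpg",
)
```

On the ground, the Command Centre would parse each packet and feed the readings into the intelligent environment.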
In addition to reporting on the immediate situation, the probe is also able to perform sentiment analysis, pulling in social feeds on Twitter to draw deeper insights into the nature of an unfolding event. Unfortunately, due to the sub-zero temperatures, the display created to show the Tweets froze over; however, the probe was still able to analyse the sentiment of the Tweets it received. This type of feed could be used to analyse social media channels to help determine whether the effects of an incident are spreading, so the emergency services can deploy resources accordingly.
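The probe used IBM services for its sentiment analysis; as a toy stand-in for the idea, the sketch below scores incoming tweets against small positive and negative word lists and aggregates the result across a feed. The word lists, tweets and scoring scheme are purely illustrative.

```python
# Illustrative word lists, not a real sentiment model
POSITIVE = {"safe", "calm", "contained", "relief"}
NEGATIVE = {"fire", "flood", "danger", "spreading", "evacuate"}

def tweet_sentiment(text):
    """Score one tweet: +1 per positive word, -1 per negative word."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

def trend(tweets):
    """Aggregate sentiment across a feed; a falling score could suggest
    an incident is worsening or spreading."""
    return sum(tweet_sentiment(t) for t in tweets)

feed = [
    "Fire spreading near the park, evacuate now!",
    "Crews say the blaze is contained, what a relief",
]
```

A real deployment would replace the word lists with a trained model and track the trend over time windows rather than a single sum.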
Launch commencing in 5...4...3...2...1
The team launched its cognitive probe via a helium balloon, which eventually went on to reach heights of 25km above Earth. On its journey the probe was able to monitor and describe its environment, attaching a confidence level to each identification. But while it was incredible to see the cognitive probe identify its surroundings, what was even more fascinating was seeing how it coped with more unusual visual input when relying on its default training. For example, upon launch all the students were moving rapidly around a sports field, performing their hurried final preparations, and the probe interpreted this as people playing sport. Given it's rare for a group of people to be launching a space probe, it's a pretty good interpretation of the situation and a reasonable assumption most would make.
Already we’ve begun to build on what we have learned with the cognitive probe, and are now working with Imperial College and IBM partners to develop a cognitive rescue drone. This cognitive drone could identify people in distress, and relay data on temperature and air quality, as well as perform sentiment analysis on social media feeds around the area of an emergency, to determine if the effects of the emergency are spreading.
In part two of this article we will look at real-life applications for the space probe.