
Sensor Emitter

A couple of weeks ago we explored some of the common sensors that are found in our smartphones. They are used practically every time that we pick up our phones or turn them on, but how much do we really know about what they do and how they work?


One young man who knows is Philip Daubmeier, 27, of Ingolstadt, Germany, who has just finished his master’s degree in computer science. 

For his master’s thesis, Philip needed a tool to evaluate the sensor measurements in a smartphone, and as a result he created the Sensor Emitter app, which lets you interact with and learn about the sensors on your Nokia Lumia.

We spoke to Philip, who is now an automotive software engineer, to shed some light on the sensors within a smartphone, why they are so important and which apps make the best use of them.

What got you interested in developing apps for Windows Phone?

I started developing apps when the first developer tools for Windows Phone 7 were released.

I totally fell for this platform, mainly because of its clean and modern user interface, but also because of the technology underneath.

Still, it took quite a few months until I published my first app in the store.

What was your first app?

It is called ‘Technicorder’ and allows you to set timers in your digital video recorder from anywhere with a Windows Phone device. It is designed to work with Technisat German TV boxes – they have many models with a built-in hard drive and a network interface. 

I wanted something like that for myself, so I would never miss any of my favourite shows when I came home late.

What inspired the Sensor Emitter app?

I needed a tool to evaluate sensor measurements for my master’s thesis. The main goal of the thesis was to estimate some key parameters of a person’s body: to find out if someone is standing or sitting or where they are looking, all from the smartphone that is carried in their pocket.

To achieve this, I combined the smartphone sensor data that was streamed live to a PC with the 3D image of a Kinect. The smartphone knew how it was located inside the pocket, while the Kinect could see the actual body of a test person. Put these two things together with a machine-learning algorithm, do some fine-tuning magic and you get really good results! I published the source code of the final machine-learning program for my thesis here.
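The thesis code linked above is the authoritative version; as a rough illustration of the idea only, here is a minimal sketch of training a classifier on labelled sensor features, where the labels (standing, sitting) would come from the Kinect's view of the test person. The nearest-centroid approach, the feature choices and all numbers here are invented for illustration, not taken from the thesis.

```python
# Sketch: classify body posture from smartphone accelerometer features
# using labelled training data (labels supplied by, e.g., a Kinect).
# All feature choices and values are illustrative.
import math
from collections import defaultdict

def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def train(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    by_label = defaultdict(list)
    for features, label in samples:
        by_label[label].append(features)
    return {label: centroid(vecs) for label, vecs in by_label.items()}

def classify(model, features):
    """Predict the label whose centroid is nearest to the features."""
    return min(model, key=lambda label: math.dist(model[label], features))

# Feature vectors: (mean vertical acceleration in g, movement variance)
training = [
    ((1.00, 0.02), "standing"),
    ((0.98, 0.03), "standing"),
    ((0.55, 0.01), "sitting"),
    ((0.60, 0.02), "sitting"),
]
model = train(training)
print(classify(model, (0.97, 0.02)))  # -> standing
```

A real pipeline would use richer features and a proper learning algorithm, but the shape is the same: labelled sensor windows in, a posture predictor out.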

A lot of the information in Sensor Emitter is very technical. How difficult was it to bring all this together?

I had to do tons of research for my master’s thesis and was fascinated by the level of complexity that is needed to get all those different inertial sensors to work. It is incredible how many components of hardware and software have to work together to measure, filter, transform and combine all the raw data into something useful. 

They are inside every smartphone but hardly anyone knows how they work.

I realised that I could share my work by taking the already existing tool that I needed to send those sensor values to the PC, and complete the app by adding explanations of all sensors to it.

I also published the programming interface and a code snippet for a server component to allow everyone to create their own computer software that uses the phone’s sensors. I recently got an email from a physics teacher, who told me he built a small programme that displayed the acceleration values and their integrated speed values in diagrams. This way, his students could experience basic laws of physics hands-on.
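The teacher's program itself isn't published here, but the core idea, numerically integrating streamed acceleration samples to recover speed, can be sketched in a few lines. The function name and sample data below are made up for illustration.

```python
# Sketch: integrate a stream of acceleration samples (m/s^2) into
# speed (m/s) using the trapezoidal rule. Data values are invented.
def integrate_speed(samples, dt):
    """samples: acceleration readings taken every dt seconds.
    Returns the speed after each sample, assuming speed starts at 0."""
    speeds = []
    v = 0.0
    for i in range(1, len(samples)):
        v += 0.5 * (samples[i - 1] + samples[i]) * dt  # trapezoid area
        speeds.append(v)
    return speeds

# A phone accelerating at 2 m/s^2 for half a second, sampled at 100 Hz:
accel = [2.0] * 51
print(integrate_speed(accel, 0.01)[-1])  # -> 1.0 m/s after 0.5 s
```

Plotting the acceleration and speed lists side by side gives exactly the kind of diagram the students could explore.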

How important are the sensors in a smartphone? What are the sorts of things that we couldn’t do without the sensors?

Sensors are those pieces of hardware that let the software access the real world around it.

Therefore they allow a phone to become aware of its environment and enable human-machine interaction.

The cool thing is that sensors can be used for totally different purposes, depending on the software that interprets them. Originally introduced to rotate the display when the phone was turned, the accelerometer is now also used to control games, monitor sleep cycles, infer your body posture and much more.
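The display-rotation case shows how little logic the original use needed: gravity appears as a roughly constant 1 g acceleration, and comparing its x and y components tells you which edge of the phone points down. A rough sketch follows; the axis and sign conventions here are an assumption, and real implementations add filtering and hysteresis.

```python
# Sketch: decide screen orientation from one accelerometer reading,
# assuming x points right and y points up the screen, in units of g,
# with an upright phone reading roughly (0, -1). These conventions
# are an assumption; platforms differ.
def orientation(ax, ay):
    if abs(ay) >= abs(ax):
        return "portrait" if ay < 0 else "portrait-upside-down"
    return "landscape-left" if ax < 0 else "landscape-right"

print(orientation(0.05, -0.98))   # phone held upright -> portrait
print(orientation(-0.99, 0.03))   # tilted on its side -> landscape-left
```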

The possibilities are nearly endless, especially if many sensors are combined to get a more meaningful and complete picture of the environment around the device.

Smartphones and other mobile devices will keep getting more sensors – and more accurate ones – so we are looking at an exciting future.

In your opinion, what are some of the best apps that make use of sensors?

Sleep-cycle monitoring apps that wake you up in a light sleep phase are a great idea. They use the fact that we make different movements in different sleep phases and sense those via the inertial sensors.
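Those apps' exact algorithms aren't public; a minimal sketch of the core idea, scoring movement intensity as the variance of the acceleration magnitude over a window, could look like this. The threshold below is invented.

```python
# Sketch: label a window of sleep as "light" or "deep" from
# accelerometer magnitudes (in g). More movement -> higher variance
# -> lighter sleep. The threshold is invented; real apps tune it
# over much longer windows.
import statistics

def sleep_phase(magnitudes, threshold=0.0004):
    return "light" if statistics.variance(magnitudes) > threshold else "deep"

restless = [1.00, 1.05, 0.96, 1.08, 0.94]   # tossing and turning
still    = [1.00, 1.01, 1.00, 0.99, 1.00]   # barely moving
print(sleep_phase(restless), sleep_phase(still))  # -> light deep
```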

Also, those metal detector apps, which use the compass/magnetometer to detect nearby ferromagnetic metals, are a cool gimmick.
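The trick behind such detector apps is simple: the Earth's magnetic field is roughly constant at a given location (around 25-65 microtesla), so a magnetometer magnitude that deviates sharply from a calibrated baseline suggests ferromagnetic metal nearby. A sketch, with an illustrative baseline and tolerance:

```python
# Sketch: flag nearby ferromagnetic metal when the magnetometer's
# field magnitude deviates from a calibrated baseline. The baseline
# and tolerance values are illustrative, not from any real app.
import math

def field_magnitude(x, y, z):
    """Magnetic field strength in microtesla from a 3-axis reading."""
    return math.sqrt(x * x + y * y + z * z)

def metal_nearby(reading, baseline_ut=50.0, tolerance_ut=15.0):
    return abs(field_magnitude(*reading) - baseline_ut) > tolerance_ut

print(metal_nearby((20.0, 30.0, 33.0)))   # ~48.9 uT, near baseline -> False
print(metal_nearby((60.0, 80.0, 10.0)))   # ~100.5 uT -> True
```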

I’ve picked those examples because they use the sensors in a way that nobody thought of at the time they were originally built into the phone.

This is also why I think this is only the tip of the iceberg and there are many clever ideas and much more sophisticated apps still to come.