Weekly Update 4-20-16 to 4-26-16

This week was spent trying to rotate our data so that it is properly oriented with respect to the shark. If the sensor is attached at an angle, gravity is read on all three channels of the accelerometer, so we rotate the data until all of the acceleration due to gravity lies along the z axis. Doing this takes a couple of steps. The first is to find the acceleration from gravity (or, more accurately, the normal force) on each of the three axes. To do this we simply do the opposite of what we did last week to remove gravity: we pass the data through a low-pass filter so that only the acceleration due to gravity remains. Alternatively, we can pass the data through the high-pass filter and look at the difference between the filtered and unfiltered data. Once the acceleration due to gravity in each direction is found, a gravity vector can be constructed using geometry. We then want to rotate the rest of our data so that this gravity vector becomes the z axis. We did this by deriving a matrix of transformation. Rotating the data in this way can be thought of as two distinct transformations: I decided to do one rotation in the xy plane and then one in the xz plane, because it's easier to determine matrices of transformation in two dimensions. Once the matrix for each transformation was derived, I multiplied them together to get a single three-dimensional transformation matrix. Finally, to check my work, I made sure that all three of the new axes were perpendicular to each other by taking their pairwise dot products and confirming that each was equal to zero.
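As a rough illustration of the two-step rotation, here is a minimal Python sketch (our actual analysis code is in MATLAB, and every name and value here is my own): it builds the xy-plane and xz-plane rotations from a measured gravity vector and multiplies them into one three-dimensional matrix.

```python
import math

def gravity_alignment_matrix(gx, gy, gz):
    """Build a 3x3 rotation that maps the measured gravity vector (gx, gy, gz)
    onto the +z axis, as two 2-D rotations: one in the xy plane, then one
    in the xz plane."""
    r = math.hypot(gx, gy)            # gravity magnitude within the xy plane
    t1 = math.atan2(gy, gx)           # angle of gravity within the xy plane
    t2 = math.atan2(r, gz)            # tilt of gravity away from the z axis
    # Rotate about z by -t1: moves the gravity vector into the xz plane
    rz = [[math.cos(t1),  math.sin(t1), 0.0],
          [-math.sin(t1), math.cos(t1), 0.0],
          [0.0, 0.0, 1.0]]
    # Rotate about y by -t2: moves the gravity vector onto the z axis
    ry = [[math.cos(t2), 0.0, -math.sin(t2)],
          [0.0, 1.0, 0.0],
          [math.sin(t2), 0.0, math.cos(t2)]]
    # Multiply the two 2-D transformations into one 3-D matrix (ry * rz)
    return [[sum(ry[i][k] * rz[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

# Gravity measured at an angle, (0.3, 0.4, ~0.866) g, should rotate to (0, 0, 1) g
g = [0.3, 0.4, math.sqrt(0.75)]
R = gravity_alignment_matrix(*g)
print(apply(R, g))   # numerically very close to [0.0, 0.0, 1.0]
```

The dot-product sanity check from the post corresponds to verifying that the rows of `R` are mutually perpendicular unit vectors.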

These matrices took me a long time to come up with. Most of the time was spent drawing the same couple of figures on a sheet of paper and trying hard to visualize moving the axes of a three-dimensional Cartesian coordinate system. I found this surprisingly difficult conceptually, but I think I eventually figured it out. In fact, as of the time of writing I don't really know whether my matrix of transformation is correct. It certainly changes the coordinates, but I'm not sure that I found the angles I need correctly or that the coordinate system is in fact oriented the way I want. I am thinking about good ways to test this and am going to continue working on this change of coordinates in the coming week.

Finally, I am also looking into something that would make the change of coordinates a lot easier: adding a gyroscope to the system. This would add a lot of power consumption but would take the guesswork out of finding the orientation of the device. The current strategy of finding gravity and the angles using filters is inaccurate and error prone, especially if there is any rotation in the system. A gyroscope would let us look at the movement of the animal with six degrees of freedom and lead to much more accurate data.

Weekly Update 4-13-16 to 4-19-16

I spent this week looking at ways to get the acceleration due to gravity out of our data. As I discussed last week, the accelerometer does not just pick up acceleration from being moved; it also picks up acceleration due to gravity, or more precisely acceleration due to forces counteracting gravity, such as the normal force. At first it may seem trivial to remove this acceleration from the data: since gravity always accelerates things downward at 9.8 m/s², we could just subtract that value from our z axis data. While the part about acceleration due to gravity being a constant 9.8 m/s² downward is mostly true (it changes a little with elevation and latitude), we cannot just subtract this value from our z axis values, because doing so assumes the accelerometer is perfectly flat and level. That is not a good assumption, and it means we need a more sophisticated way to remove the acceleration due to gravity, keeping in mind that this acceleration can affect all three axes by unknown amounts, since the exact angle of the accelerometer is unknown.

A popular way around this problem is to pair a gyroscope with the accelerometer, which lets you find the angle of the device and thus use geometry to find and remove the components of the gravity vector. We don't have a gyroscope, so we cannot use this method. The method I am trying right now is less precise but is currently showing promise: filtering out the acceleration due to gravity. The idea is that since the acceleration due to gravity is nearly constant, we can put the data through a high-pass filter to remove the low-frequency acceleration due to gravity. The filter should pass all of the non-gravity acceleration, because that should happen at a much higher frequency.
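To make the idea concrete, here is a toy Python sketch (not our MATLAB code; the single-pole filter and the alpha value are simplifications I chose for illustration). A low-pass filter tracks the slowly varying gravity estimate, and subtracting that estimate from the raw signal acts as the complementary high-pass:

```python
def split_gravity(samples, alpha=0.05):
    """Split an acceleration trace into a slow 'gravity' part and the rest.

    A single-pole low-pass filter (exponential moving average) tracks the
    nearly constant gravity component; subtracting the estimate from the
    raw signal is the complementary high-pass."""
    gravity = samples[0]                 # seed the estimate with the first reading
    grav, motion = [], []
    for a in samples:
        gravity = (1 - alpha) * gravity + alpha * a   # low-pass update
        grav.append(gravity)
        motion.append(a - gravity)                    # what's left is motion
    return grav, motion

# A constant 1 g reading with a short burst of shaking in the middle
raw = [1.0] * 50 + [1.5, 0.5] * 5 + [1.0] * 50
grav, motion = split_gravity(raw)
print(grav[-1], motion[-1])   # gravity estimate settles back near 1 g, motion near 0
```

A smaller `alpha` corresponds to a lower cutoff frequency: the gravity estimate reacts more slowly, but more of the real motion survives the subtraction.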

With this plan I researched the types of filters other people had used for this problem. I found a suggestion to use a fourth-order Butterworth filter with a cutoff frequency between 0.1 Hz and 0.5 Hz. An implementation of this filter comes in the MATLAB Signal Processing Toolbox, so I set up a MATLAB script to use it and played around with the parameters. I used a small data set in which I moved the accelerometer back and forth in the y direction, then the x, and finally the z. I found that this filter has a tendency to invert the data, and it also seemed to leave some amount of the acceleration due to gravity in. Additionally, at the beginning of the data set it introduces something that looks like a roll-off. I do not entirely understand this behavior and plan to research exactly how this filter works and what it is doing. The result of this filter with the cutoff frequency set to 0.1 Hz can be seen below.

Not the greatest filtering


After this I created my own high-pass filter from scratch. I had made filters in MATLAB before for a class, so I took that code and modified it for this purpose. The filter is a crude brick-wall high-pass filter. I started by performing a fast Fourier transform (FFT) on the data and identifying the frequencies of the acceleration due to gravity. This can be seen below.

Result of the FFT of the z data


Unsurprisingly, there is a spike at 0 Hz, corresponding to the constant acceleration due to gravity. After this I simply removed everything around this frequency, performed an inverse FFT, and plotted the result, which can be seen in the figure below.

Result of a high pass brick wall filter on the z data


This looks much better than the Butterworth filter, and for this data set it is. However, I do not think this sort of brick-wall filter is a good idea in the long run. These filtering options are actually somewhat similar to the subtraction discussed at the beginning of this post, which means they will not handle rotation very well: when the device rotates, it will appear that the acceleration due to gravity is changing. I believe the brick-wall filter will have more trouble dealing with this than more sophisticated filters would. That being said, I'm not sure that the Butterworth filter is the right choice either. I do not know a lot about digital filters, or signal analysis in general, so over the coming weeks I plan to do a lot of research to get a better idea of exactly what filter I want and what its limitations are.
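The brick-wall procedure can be sketched in a few lines; this is a Python/NumPy version rather than my MATLAB script, and the sample rate, cutoff, and test signal are made up for illustration:

```python
import numpy as np

def brick_wall_highpass(signal, fs, cutoff_hz):
    """Crude brick-wall high-pass: FFT the data, zero every bin below the
    cutoff (including the 0 Hz gravity spike), then inverse FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[freqs < cutoff_hz] = 0        # delete DC and low frequencies
    return np.fft.irfft(spectrum, n=len(signal))

# A 1 g constant offset plus a 2 Hz wiggle, 8 s sampled at 25 Hz
fs = 25.0
t = np.arange(0, 8, 1 / fs)
raw = 1.0 + 0.3 * np.sin(2 * np.pi * 2.0 * t)
filtered = brick_wall_highpass(raw, fs, cutoff_hz=0.5)
print(filtered.mean())   # the constant 1 g offset is removed, so the mean is ~0
```

On this synthetic signal the 0 Hz offset vanishes and the 2 Hz motion passes through untouched; the trouble described above starts when the "constant" gravity component itself drifts between frequency bins as the device rotates.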

Ben’s Updates – Optimizations &c.

Hi, everyone. It’s been a while since I’ve posted, and quite a bit has happened, so let’s get started.


When I last posted, the codebase… worked, but it was a mess; everything was crammed into setup() and loop(), making it painful to change any one sensor’s functionality. Sure, architecture can slow your code down, but a little organization can’t hurt, right?

I figured that I might as well leverage C++ and make each sensor a class implementing an abstract base class, Sensor. I didn’t do much more than that; even the ABC is just a way of enforcing consistent method names. I don’t know a lot of C++, but I don’t really need vtables, range-based for loops, and the like for this program anyway.
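For illustration, here is the same pattern sketched in Python rather than C++ (the firmware itself is Arduino C++, and all class and method names here are placeholders, not the real codebase's):

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Abstract base class that only enforces consistent method names."""
    @abstractmethod
    def setup(self): ...
    @abstractmethod
    def update(self): ...

class Accelerometer(Sensor):
    def setup(self):
        return "accel ready"
    def update(self):
        return (0.0, 0.0, 1.0)   # placeholder reading, in g

# The main loop can then treat every sensor uniformly
sensors = [Accelerometer()]
for s in sensors:
    s.setup()
    print(s.update())   # -> (0.0, 0.0, 1.0)
```

The payoff is the loop at the bottom: adding a sensor means adding one class, not touching setup() and loop() everywhere.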

Doing this made things a bit more manageable; I still want to break up the update() loop so that I can easily change when or if data is written. This is because, aside from raw data reads and serialization, my main source of optimization after this will be to skip some SD writes.


Skipping SD writes is pretty simple; I keep a running tab of the variance and mean. If, accounting for gravity, it doesn’t seem to have been moving at any point in the data we just collected, we throw out the data and keep track of how many times we did that. When the shark does start moving, we just write “skipped X writes” to the SD card and continue as usual.
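A Python sketch of that gating logic (the real firmware is Arduino C++, and the 0.05 g threshold here is a placeholder I made up). Because gravity keeps the mean near 1 g even when the tag sits still, the test is on the variance, which is near zero when nothing is moving:

```python
class MotionGate:
    """Decide whether a block of accelerometer magnitudes is worth writing."""
    def __init__(self, threshold=0.05):   # threshold in g, a placeholder value
        self.threshold = threshold
        self.skipped = 0                  # count of blocks thrown away

    def should_write(self, block):
        mean = sum(block) / len(block)
        var = sum((x - mean) ** 2 for x in block) / len(block)
        if var < self.threshold ** 2:     # near-constant: the tag sat still
            self.skipped += 1             # later, log "skipped X writes"
            return False
        return True

gate = MotionGate()
still = [1.0, 1.001, 0.999, 1.0]   # resting: gravity only
moving = [1.0, 1.4, 0.6, 1.2]      # tail-beat style motion
print(gate.should_write(still), gate.should_write(moving), gate.skipped)
# -> False True 1
```

When motion resumes, the firmware writes the `skipped` count to the SD card and resets it, so no time information is lost.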

Raw data reads and serialization look a bit more painful. There’s plenty of overhead and cruft in the libraries we’re using. For example, reading from the accelerometer reads the registers, packs the bytes into short ints, stores them, and then converts the ints into floats based on the scale setting. In reality, we don’t need anything but those 6 bytes of raw accelerometer data, and while that example is relatively minor, the benefits stack up over time.
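As a hypothetical illustration of the conversion being skipped, here is how six raw accelerometer bytes might be unpacked and scaled in Python (the little-endian byte order and the ±2 g scale factor are assumptions; the actual values depend on the chip and its configuration):

```python
import struct

def unpack_accel(raw6, scale=2.0 / 32768):
    """Convert six raw data bytes into (x, y, z) in g.

    Assumes three little-endian signed 16-bit values and a ±2 g full-scale
    setting, so one count is 2/32768 g; both are illustrative choices."""
    x, y, z = struct.unpack("<hhh", raw6)   # three little-endian shorts
    return (x * scale, y * scale, z * scale)

# 16384 counts is half of full scale, i.e. 1.0 g at the assumed setting
sample = struct.pack("<hhh", 0, 0, 16384)
print(unpack_accel(sample))   # -> (0.0, 0.0, 1.0)
```

Writing the 6 bytes straight to the SD card and doing this conversion later on a PC moves all of that arithmetic off the microcontroller.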

Still, doing that requires a lot of investment; low-level code to read the sensors, a new file format to store the raw data, a Python script to convert the data back into a CSV. All that is for benefits that, while significant, are pretty late-game in the grand scheme of things. We’ve got other things to worry about… namely, storing and serving said data for analysis.

The Server

Given the amount of data we’re collecting and the rate at which we’re collecting it, I expect us to be collecting at most about 3 gigabytes a month. That’s not huge at first, but we want to reuse these tags and host the data on a server so that no one has to constantly keep a local copy on their hard drive.

One solution we have is to adapt tagbase to our needs, replacing the Microsoft Access interface with SQL; we actually know the people who put it together and can get some help from them. There are people here who have worked with servers before; I haven’t, and I don’t think anyone has really used SQL yet. This will be a fun experience, indeed…


I’ve got plenty of stuff to do here, both now and over the summer. I got approved for a grant, so I’ve got summer housing both at my college and the Chesapeake Bay… we’ll see how this goes. See you guys in a month or so.

Weekly Update 4-6-16 to 4-12-16

This week was spent mostly working on the analysis code, with a brief trip back into the Arduino code. I began by cleaning up the code I was using to generate the plots of the acceleration data. When I wrote the code the first time I did not put in enough comments and named my variables in a confusing way, so I fixed that. I also put a little time into making the generated plots nicer to look at by changing the markers used for the data and adding axis titles.

Then I wanted to start thinking about how to remove gravity from the data, but first I had to rework the code I was using to import the data into MATLAB. Ben had made some changes to the format the data is saved in, and I had to reflect that in the MATLAB import code. While doing this I realized there were some other things I wanted to change about the format of the saved data, and figured it was a good time to do it so I wouldn't have to change the MATLAB import code again. These changes took a little longer than intended because I had not been keeping up with Ben's development of that code as closely as I should have been. I don't consider this lost time, though, because I accomplished what I wanted to do and now have a much better idea of exactly what Ben's been up to.

Once the Arduino code was back in line I finished the MATLAB import code. Then I began to think about calibration of the accelerometer. The accelerometer was calibrated at the factory, but it is possible, and not uncommon, for small, cheap accelerometers to lose some accuracy over time. In theory it is not hard to test the calibration of an accelerometer, because we live with a well-known and constant acceleration all around us: gravity. However, an accelerometer cannot directly measure the acceleration due to gravity, because it measures what is known as proper acceleration, which is the acceleration relative to free fall. This means that if you place an accelerometer flat on a level surface it will read an acceleration of 1 g upwards. This comes from gravity, but it points upward because the accelerometer isn't reading the acceleration due to gravity itself, but rather the acceleration from the normal force exerted by the level surface. This leads to two ways to test whether an accelerometer is calibrated correctly: the first is to place it perfectly flat on a perfectly level surface and confirm that it reads exactly 1 g in the z direction and 0 g in both x and y; the other is to place the accelerometer in free fall and confirm that it reads 0 on all axes. Both of these options have challenges, and I will probably try both next week.
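A small Python sketch of the first (flat-and-level) check; the function name is my own and the tolerance is a made-up placeholder that should really come from the part's datasheet:

```python
def check_flat_calibration(x, y, z, tol=0.02):
    """Static check: flat and level, the readings (in g) should be ~0, ~0, ~1.

    tol is an illustrative tolerance in g; pick a real one from the
    accelerometer's datasheet (offset and noise specs)."""
    errors = (abs(x), abs(y), abs(z - 1.0))   # deviation from (0, 0, 1) g
    return all(e <= tol for e in errors), errors

ok, errors = check_flat_calibration(0.01, -0.005, 0.99)
print(ok)   # -> True
```

The free-fall test is the same idea with an expected reading of (0, 0, 0) g, though holding the device still enough during the drop is its own challenge.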

History Part 4

This is my fourth and final post summarizing the work I did on the SharkDuino project before the creation of the blog; the first three parts can be found here.

This post will cover the work I did during the fall semester of 2015 and winter break. It will be a little shorter than the previous ones, since the project went on a small hiatus during the fall semester so I could focus on school work. At the beginning of the semester we decided to start creating prototype sensors to begin getting preliminary data. To do this we needed to build versions of the system off of the breadboard. To save money, we decided to remove the pins from the set of components we had been working with so that we could reuse them in the prototype sensors. This sounds easy enough in theory, but in practice it is difficult to desolder the pins from a breakout board, and it ended up taking me quite a long time. Once I finished desoldering, I tested the components to find out whether I had broken any of them during the process. I did end up ruining the micro SD reader by accidentally removing the solder pads needed to make new connections to the board. Overall this process was long and painstaking. It was worth it for some of the pricier components, such as the pressure sensor, but for the cheaper components I should not have bothered and should have just bought fresh ones.

Once all the components had their pins removed, it was time to solder them together with wires. To build this first board I simply laid out all of the components on a piece of cardboard and cut holes in it, essentially creating a solderless breadboard. I then connected the components together with wires, splicing when I wanted multiple components to connect to the same pin on the Arduino. The result can be seen below; it's not pretty, but it worked.

The first version of the device


After I finished soldering the device together, not everything worked immediately. I started by uploading a blank sketch to the Arduino to confirm that it could connect to the computer. I then ran code to test all of the components individually and found that the accelerometer and temperature sensor were working, but the SD reader and RTC were not. After some long hours going over all the connections with a multimeter, I was eventually able to identify some cold solder joints and repair them. Once I did this the SD card began to work, but the RTC was still acting funny. After a little more troubleshooting I found that the battery on the RTC had died; I replaced it and then got normal behavior from the RTC when it was used alone.

Now, with the hardware working, it was time to get to work on the software. I found that not all of the components would work together properly: the SD card would work with the accelerometer and temperature sensor, but once the RTC was added the SD card would break. The cause of this problem and my solution to it are the topics of my first weekly update.

This brings me up to when this blog began and I started posting weekly about the nitty-gritty of the project. In this series I tried to give an overview of the work done without getting too bogged down in the details, but in doing so I am sure I left out some important information. If anything else of importance comes to mind, or I find some lost notes, I will be sure to post about it. Regardless, I hope this series has been informative about my early work on this project and provides useful context for the work being done now.