AVS Forum | Home Theater Discussions And Reviews (https://www.avsforum.com/forum/)
-   DIY Speakers and Subs (https://www.avsforum.com/forum/155-diy-speakers-subs/)
-   -   An rpi based DIY Vibration meter (https://www.avsforum.com/forum/155-diy-speakers-subs/2681865-rpi-based-diy-vibration-meter.html)

3ll3d00d 12-16-2016 12:43 PM

An rpi based DIY Vibration meter
 
Current Release: 0.3.8

Binaries for OSX, Windows and certain flavours of Linux are available via https://github.com/3ll3d00d/qvibe-an...eleases/latest

refer to https://github.com/3ll3d00d/qvibe-analyser/releases for a complete list of all published releases

refer to https://qvibe.readthedocs.io/en/latest/ for documentation about how to install and configure the app & the required hardware

Features
- collecting data from 1 or more connected sensors concurrently
- charts available for each measurement
- time series data (vibration, tilt, raw data)
- frequency response data (spectrum, peak spectrum and psd)
- RTA view
- live spectrogram view
- measurements can be analysed by axis of vibration or using a summed response (calculated using a root sum of squares method with more weight placed on x and y axis vibration)
- allows the user to compare 1-n data sets in a single graph
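The weighted root-sum-of-squares summation mentioned in the feature list can be sketched as follows; the weights shown here are illustrative only, not the values the app actually uses:

```python
import numpy as np

def summed_response(x, y, z, wx=1.0, wy=1.0, wz=0.7):
    """Combine per-axis acceleration into a single series using a
    weighted root sum of squares. More weight on x and y than z,
    per the feature description; the exact factors are assumptions."""
    return np.sqrt((wx * x) ** 2 + (wy * y) ** 2 + (wz * z) ** 2)

# e.g. equal vibration on all three axes
x = y = z = np.ones(4)
print(summed_response(x, y, z))
```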


derrickdj1 12-16-2016 03:10 PM

You have to post a lot of pics along the way. :) This may be a first on this thread!

dominguez1 12-16-2016 04:45 PM

This will be awesome...sub'd. :cool:

FriscoDTM 12-16-2016 06:02 PM

This is a great project - I almost picked up one of the arduino accelerometer boards to play with but figured it would sit on the shelf and was beyond my skill level. It will be very cool if you can use it to time align subs and transducers to maximize constructive interference.

BassThatHz 12-16-2016 07:06 PM

Sounds like a neat project.
The thought has crossed my mind before. But then I got lazy and said meh to myself. :o

Quote:

Originally Posted by 3ll3d00d (Post 49038585)
Software
- python on data analysis duties (numpy & scipy seem to do everything that is really necessary)
- python/c to collect the data via the i2c bus (a few people have posted code on github that provides working copies of this)
- python on webapp/microservice duties (flask?)
- reactjs for the front end

I'm no python expert but flask and reactjs sound bloated.

Research this way instead; it should run faster and with fewer headaches.

import asyncio
import websockets
http://websockets.readthedocs.io/en/stable/intro.html

Code:

#!/usr/bin/env python

import asyncio
import datetime
import random
import websockets

async def time(websocket, path):
    while True:
        now = datetime.datetime.utcnow().isoformat() + 'Z'
        await websocket.send(now)
        await asyncio.sleep(random.random() * 3)

start_server = websockets.serve(time, '127.0.0.1', 5678)

asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()

Code:

<!DOCTYPE html>
<html>
    <head>
        <title>WebSocket demo</title>
    </head>
    <body>
        <script>
            var ws = new WebSocket("ws://127.0.0.1:5678/"),
                messages = document.createElement('ul');
            ws.onmessage = function (event) {
                var messages = document.getElementsByTagName('ul')[0],
                    message = document.createElement('li'),
                    content = document.createTextNode(event.data);
                message.appendChild(content);
                messages.appendChild(message);
            };
            document.body.appendChild(messages);
        </script>
    </body>
</html>

I would code two endpoints:
1) one plain HTTP handler to serve the above HTML to the browser
2) and the above websocket to do the real-time HTML/async JavaScript data updates.

Not a whole lot of code is it? It is pretty much already coded for you! ;) Just replace the UTC time code sender with the actual content you want to display.

I'm not sure what libs Python has, but surely there must be either a chart JPEG rasterizer or maybe HTML5 vector graphics if you want to get real fancy.
(Unless flask and reactjs do that already... in which case: as you were...) ;)

BassThatHz 12-16-2016 07:46 PM

Quote:

Originally Posted by 3ll3d00d (Post 49038585)
Software
- python on data analysis duties (numpy & scipy seem to do everything that is really necessary)
- python/c to collect the data via the i2c bus (a few people have posted code on github that provides working copies of this)
- python on webapp/microservice duties (flask?)
- reactjs for the front end

There is only one gotcha that might not be obvious to you.

Not sure how much web server development you have done, but they are ALL multi-threaded by default. No matter what framework/API/Lib you use.

So that async function pointer, for example, will be running in a multi-threaded context.
You'll have to ensure your IPCs are thread-safe or apply a locking strategy.

I'm more of a C# person myself, and in that language you could make the classes/functions/variables static and then do dirty reads of the static data variable/array (a read-only operation).
Both of which are thread-safe and require no locking.
(In the windows world you have to do another step: set IIS to never kill the worker process.)
Whatever the Python equivalent of that is...

I know C++ and C# have a static keyword. Not sure about Python or C though...

If you can do static, then you don't need IPC's as there is only one instance to deal with. Which may make your life easier (and the code: faster).

That said, if you make the app state-less, you might be able to just "deal with" the overhead of instantiating a new object reference for each async web request. That's kind of a lazy-man's way of achieving the above though.

At the end of the day it's up to you which direction you want to go. I just gave you 3 possible options.
If you care about performance I'd personally do static if you can, or IPC with locking if not, and state-less as a last resort.

BassThatHz 12-16-2016 09:54 PM

Quote:

Originally Posted by FriscoDTM (Post 49047369)
to time align subs and transducers to maximize constructive interference.

The speed of sound through wood is much higher than through air.
Like 10,000ft per second vs ~1,100ft per second.
So for every foot of air distance to your sub you'd probably have to add roughly 1ms of delay to the transducer. (At least, in theory...)
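A quick sanity check on that arithmetic (the speeds are ballpark figures, not measured values):

```python
V_WOOD = 10_000   # ft/s, rough speed of sound along wood grain
V_AIR = 1_125     # ft/s, speed of sound in air at ~20C

def delay_ms_per_foot():
    # extra time the airborne wave needs per foot versus the structural path
    return (1 / V_AIR - 1 / V_WOOD) * 1000

print(round(delay_ms_per_foot(), 2))  # ~0.79 ms per foot of air path
```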

3ll3d00d 12-17-2016 02:51 AM

the UI is going to be pretty simple given that if I were writing this for purely personal use, the UI would be a bunch of scripts :) It will basically be a couple of forms that let you select a dataset to work with, choose what sort of graphs you want to see, perhaps allow you to override some of the analysis parameters and then show you some graphs. There's no realtime interaction with the measurements like some sort of rpi speclab in a browser, it's purely offline analysis with perhaps some visual indicator that a measurement is in progress. As such there are no concurrency concerns on that front. We'll see how fast the pi is at reading & writing the data out though. The MPU-6050 has a FIFO buffer which means you only need to read a chunk of data about every 600ms or so, I can imagine needing to hand the data off to another thread to write to disk but we'll see.

flask looks fairly lightweight as far as I can see, never used it but the assorted examples look pretty simple. There is an even simpler alternative called bottle mind you. I'll just see how it goes. react is pretty simple as well and it makes writing a ui pretty simple. Ultimately I just want to knock out a functional ui as quickly as possible.

The main issue I see at the moment is cable length from the GPIO to the accelerometer, an i2c signal seems very sensitive to cable length if you want high sample rates so cables are typically ~20cm at most which is not very far at all & might make isolating the accelerometer from the rpi tricky. This would be a good argument for getting an accelerometer with an analogue output and then feeding that back in through an ADC. It's probably a good thing I have an accelerometer already as I will be able to compare results from the two sources and then make a judgement on how to proceed.

I'm not sure if SPI can support longer cables, if so then the MPU-6000 is an option as that is basically the same device with an SPI interface. There is also a newer MPU-9250 but they don't generally seem readily available on a breakout board. I did find https://drotek.com/shop/en/home/264-...out-board.html though. I'll probably go with the i2c version to begin with and then revisit if the cable length is a real issue.

coolrda 12-17-2016 07:53 AM

This could be the next REW. Nice job 3. Looking forward.

3ll3d00d 12-17-2016 09:55 AM

added a suggested feature list to the 1st post, can't promise how long it will take me to get round to writing all this mind you :)

BassThatHz 12-17-2016 11:48 AM

One feature you'll want is the ability to subtract gravity from the x, y or z axis.

Kinda bummed that it won't be real-time analysis though. Me sad.

Not sure how fast or how much RAM the pi has, but worst-case you could offload it to a network attached PC.
I doubt you'd need to go that far though.

It sounds like you might have done signal analysis coding before? Because if not, you could be in for a long ride.

3ll3d00d 12-17-2016 01:34 PM

Quote:

Originally Posted by BassThatHz (Post 49063929)
Kinda bummed that it won't be real-time analysis though. Me sad.

something to add at a later date perhaps, though tbh most of this data is just short clips which you want to characterise as a whole so offline analysis is sufficient as far as I can see.

Quote:

Originally Posted by BassThatHz (Post 49063929)
It sounds like you might have done signal analysis coding before? Because if not, you could be in for a long ride.

I've got all the basic analysis worked out in some simple scripts, it's pretty much all done by scipy tbh so mostly it's just a case of manipulating the results. I need to get some more data and compare it to (e.g.) speclab to be sure the results are correct but certainly seems to be so far.

BassThatHz 12-17-2016 04:53 PM

Quote:

Originally Posted by 3ll3d00d (Post 49053905)
it's purely offline analysis with perhaps some visual indicator that a measurement is in progress. As such there are no concurrency concerns on that front.

How do you plan on notifying the browser that the offline analysis has finished?

BassThatHz 12-17-2016 05:27 PM

Quote:

Originally Posted by 3ll3d00d (Post 49038585)
- stop an ongoing measurement

The HTTP responder will be running under a different thread, possibly even a different process than your acceleration analyzer (depending on how you code and host your python modules, that is...)

So how do you plan on "stopping an ongoing measurement" without running into a concurrency issue? As separate threads and/or processes usually don't have shared memory (and are running in parallel). Which will force you to implement a thread-safe IPC.

Not the end of the world, but you'd have to abandon this feature if you don't plan on implementing something to resolve the above inter-process inter-thread issue.

A static singleton getter for the acceleration analyzer module would be a common approach instead of using IPC's. That way everything can be contained in a single host process instead of multiple.

On second thought: I suppose you could do a pkill on it. (That would certainly stop the measurement. :D)

3ll3d00d 12-18-2016 03:32 AM

Quote:

Originally Posted by BassThatHz (Post 49072073)
The HTTP responder will be running under a different thread, possibly even a different process than your acceleration analyzer (depending on how you code and host your python modules, that is...)

So how do you plan on "stopping an ongoing measurement" without running into a concurrency issue? As separate threads and/or processes usually don't have shared memory (and are running in parallel). Which will force you to implement a thread-safe IPC.

sorry, I wasn't especially clear in my earlier post. When I said "no concurrency issues" here, I meant that the system is pretty much a bunch of synchronous operations so there isn't much to think about on that front. This doesn't mean there aren't multiple things happening in parallel though, the main one being the UI as you suggest.

I was going to handle this by having some sort of MeasurementDevice component which has an event-driven API and an associated simple state machine to model what is going on in there. This means the UI can submit START or STOP commands and get the current state of the component. This component will run in a separate process (as I understand it, python has a global lock problem) and may end up needing to fork itself into two processes (one for reading, one for writing). Since there is only one measurement device & all it needs to do is signal when some output is available on disk, the standard Pipe (https://docs.python.org/3/library/multiprocessing.html) looks like it will be sufficient. Moving to some "realtime" visualisation would just mean switching the reader->writer event from a queue to a pub-sub so that an analysis component can see segments of data as they arrive in parallel with the writer. It would be a basic stream processing situation at that point.

I haven't thought especially deeply about any of this though, I was assuming any modern language would give you the tools required to do this sort of thing without having to plan ahead too much.... :)
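A minimal sketch of the MeasurementDevice idea described in this post, assuming Python's multiprocessing Pipe; all names are illustrative rather than taken from the actual project:

```python
from multiprocessing import Process, Pipe

IDLE, RECORDING = 'IDLE', 'RECORDING'

def _device_loop(conn):
    # runs in a child process, so the GIL in the parent doesn't block it
    state = IDLE
    while True:
        cmd = conn.recv()
        if cmd == 'START' and state == IDLE:
            state = RECORDING   # a real device would begin sampling here
        elif cmd == 'STOP' and state == RECORDING:
            state = IDLE        # ...and flush its buffer to disk here
        elif cmd == 'STATE':
            conn.send(state)
        elif cmd == 'QUIT':
            conn.close()
            return

class MeasurementDevice:
    """Event-driven facade the UI talks to; the work happens in a child process."""
    def __init__(self):
        self._conn, child = Pipe()          # duplex by default
        self._proc = Process(target=_device_loop, args=(child,))
        self._proc.start()

    def send(self, cmd):
        self._conn.send(cmd)

    def state(self):
        self._conn.send('STATE')
        return self._conn.recv()

    def close(self):
        self._conn.send('QUIT')
        self._proc.join()

if __name__ == '__main__':
    dev = MeasurementDevice()
    dev.send('START')
    print(dev.state())  # RECORDING
    dev.send('STOP')
    print(dev.state())  # IDLE
    dev.close()
```

Because the Pipe delivers commands in order and the child handles them one at a time, START/STOP/STATE need no extra locking.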

BassThatHz 12-18-2016 06:05 AM

Quote:

Originally Posted by 3ll3d00d (Post 49079113)
sorry, I wasn't especially clear in my earlier post. When I said "no concurrency issues" here, I meant that the system is pretty much a bunch of synchronous operations so there isn't much to think about on that front. This doesn't mean there aren't multiple things happening in parallel though, the main one being the UI as you suggest.

I was going to handle this by having some sort of MeasurementDevice component which has an event-driven API and an associated simple state machine to model what is going on in there. This means the UI can submit START or STOP commands and get the current state of the component. This component will run in a separate process (as I understand it, python has a global lock problem) and may end up needing to fork itself into two processes (one for reading, one for writing). Since there is only one measurement device & all it needs to do is signal when some output is available on disk, the standard Pipe (https://docs.python.org/3/library/multiprocessing.html) looks like it will be sufficient. Moving to some "realtime" visualisation would just mean switching the reader->writer event from a queue to a pub-sub so that an analysis component can see segments of data as they arrive in parallel with the writer. It would be a basic stream processing situation at that point.

I haven't thought especially deeply about any of this though, I was assuming any modern language would give you the tools required to do this sort of thing without having to plan ahead too much.... :)

pub-sub particularly lends itself well to web-servicized back-ends, as it allows multi-machine distributed nodes. Probably overkill for a single Pi box though.
That's something that you'd typically see in a b2b integration, SOA, or a cloud-attached super-computer architecture.

NASDAQ for example uses one-way pub-sub pushes for its global architecture, if I recall correctly. But we are talking millions of feeds and desktops and business hedge fund servers and traders. A much larger scale than a single Pi box.

An asynchronous-event model would be a faster way, if you have a (global) controller using that process-lib you referenced and with no webservice backend.

You can do pub-sub without a webservice of course, the only other time pub-sub is handy in that case is when you are trying to loosely couple many active listener classes. Not sure how useful that model would be in a one-off Pi project with a single user (i.e. only 1 subscriber).

You did mention that you were leaning towards web services though.
I know that in c# you can have both async and sync web-services too, not just events/function pointers.

Not sure how python handles events, but in c# they are basically function pointers under the cover.
Events in c# don't ensure thread-safety by default, so when interacting with the UI across threads, you still need to write code like this:
Code:

// assumes this delegate is declared on the form:
private delegate void SetTextCallback(string text);

private void SetText(string text)
{
        if (this.textBox1.InvokeRequired)
        {
                // marshal the call back onto the UI thread
                SetTextCallback d = new SetTextCallback(SetText);
                this.Invoke(d, new object[] { text });
        }
        else
        {
                this.textBox1.Text = text;
        }
}

This is c# windows-forms specific, but asp.net web-apps have a similar issue.
How python handles this I'm not sure, but I'd imagine it would have a similar issue and resolution in a pythony way.

In c# you can have synchronous and asynchronous function pointers / event handlers.
async is thread-safe, and sync isn't (at least in c# land it is that way, I'd imagine python is perhaps similar in this regard... but maybe not.)

I'd imagine that the process-lib you referenced is doing some sort of IPC's under the cover, so needing to worry about thread-safety may not be necessary as it is already handling that for you; allowing you to get away with using natively non-thread-safe sync-events in a multi-process multi-thread environment.

I hope that your planned model works for you without too much headache.

In any case, these are some things to think about and prototype up before diving in head first or committing deeply to any particular model that may or may not react as first planned.

I've seen a lot of programmers bang their heads against a wall when first diving into the world of web-apps and web-services when coming from traditional command-line/forms backgrounds, or back-end vs UI. (i.e. default single-threaded vs multi-threaded environments, and real-time vs disconnected request-reply models.)

BassThatHz 12-18-2016 06:38 AM

Quote:

Originally Posted by 3ll3d00d (Post 49079113)
sorry, I wasn't especially clear in my earlier post. When I said "no concurrency issues" here, I meant that the system is pretty much a bunch of synchronous operations so there isn't much to think about on that front. This doesn't mean there aren't multiple things happening in parallel though, the main one being the UI as you suggest.

Any SSH terminal will have its own process and threads, both the desktop client and the Pi-hosted ssh server.
The same goes for any browser like Chrome, each tab launching yet another process.

Then you have your data-analyzer process/thread and measurement-device process/thread, and any given backend web-service process/thread and UI web-server process/thread, all running on the Pi (depending on how it is all coded of course...)

So there is potentially many different processes and threads all interacting with each other here. (and all taking up ram and CPU usage too I might add.)

From a pure efficiency/speed perspective I would do: one web process that manages the file-writer thread, the analyzer thread and the measurement thread.
So 1 process and 3 threads. All of which would be a static singleton pattern. Possibly with a factory pattern for loose class coupling if you desire that.
FYI: I would just use that simple Python websockets class, rather than any bloated apache server processes or anything heavier-duty like that.
That would keep it lightweight and as simple as possible. (Likely, eliminating the need for reactjs/flask entirely...)

But that would only work well if Python behaves itself when managing internal threads...
though as you mentioned: perhaps it doesn't... thus forcing you to use the process-lib way instead, and thus 4 processes each with 1 thread. (?)

3ll3d00d 12-18-2016 07:00 AM

fwiw I've been writing those sort of (large scale high throughput and/or low latency) systems for years, just never had a need to use python before. This means that most of the time I put into this so far seems to be spent working out what the idiomatic python equivalent is. This invariably then sends me off on a tangent, probably should concentrate on getting it done :)

on the multiprocess thing, Python, or at least CPython, seems to enjoy something referred to as the GIL (https://wiki.python.org/moin/GlobalInterpreterLock) apparently because the memory management isn't thread safe. This means you have to use multiple processes if you want to actually allow things to execute in parallel.
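The practical upshot of the GIL can be demonstrated with a toy CPU-bound task; under CPython, threads can't run it in parallel but separate processes can (a sketch, not project code):

```python
from multiprocessing import Pool

def burn(n):
    # pure-Python CPU work; under CPython only one thread at a time can
    # execute this, but separate processes each get their own interpreter
    # (and their own GIL), so a Pool genuinely runs them in parallel
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == '__main__':
    with Pool(2) as pool:
        # the two calls run concurrently across two worker processes
        print(pool.map(burn, [10_000, 10_000]))
```

Note that NumPy releases the GIL inside many of its own routines, so heavy numerical work in threads is less affected than pure-Python loops.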

BassThatHz 12-18-2016 07:19 AM

Quote:

Originally Posted by 3ll3d00d (Post 49081817)
fwiw I've been writing those sort of (large scale high throughput and/or low latency) systems for years, just never had a need to use python before. This means that most of the time I put into this so far seems to be spent working out what the idiomatic python equivalent is. This invariably then sends me off on a tangent, probably should concentrate on getting it done :)

on the multiprocess thing, Python, or at least CPython, seems to enjoy something referred to as the GIL (https://wiki.python.org/moin/GlobalInterpreterLock) apparently because the memory management isn't thread safe. This means you have to use multiple processes if you want to actually allow things to execute in parallel.

Well in that case... how much harder would it be to just code the whole thing in C++?
You'd have to port NumPy/I2C/websockets etc over to C syntax, but that shouldn't be "too hard", just time consuming...

Then you'd have complete control over the whole world. No silly GIL to deal with. You could then take advantage of pointers, rather than some Python memory manager.

Just a thought! :)

3ll3d00d 12-22-2016 06:45 AM

the hard bit for me is doing the signal processing so doing it in python makes sense as it basically does it all for you. Besides using a different language from the day job keeps it interesting :)

3ll3d00d 12-22-2016 06:45 AM

I've been researching the signal processing side of things to check that the libs available can do the job we need without me having to actually write any serious amount of code, seems like the answer to that one is yes (which is nice).

It has got me thinking about the actual requirements here and I think we really have 2 different use cases.

The first use case is system calibration which uses periodic white noise to characterise the response of the system to a "flat" input. We then want to see the response of the system and compare it against a (pre) defined target curve. Ideally the system would then tell us "do this to map to the target curve" (c.f. the REW auto eq window).

To support this we need a linear spectrum graph, basically the vibration equivalent of a magnitude response graph (G vs frequency). We then want to plot the response on individual axes along with the summed response and the target curve. We also want a graph that shows the difference between those two (probably summed / target). Finally a simple solution to the auto eq problem is to invert that graph and this is a starting point for the filter to apply to your TR devices.
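That pipeline can be sketched with scipy; the sample rate, the random stand-in data, the flat target curve and all names here are illustrative assumptions:

```python
import numpy as np
from scipy.signal import welch

fs = 500  # sample rate in Hz, assumed
# stand-in for 10s of summed vibration data
accel = np.random.default_rng(0).normal(size=fs * 10)

# linear spectrum of the measurement (Welch-averaged PSD, in dB)
freqs, psd = welch(accel, fs=fs, nperseg=fs)
measured_db = 10 * np.log10(psd)

# a flat target curve, purely illustrative
target_db = np.full_like(measured_db, measured_db.mean())

# difference (measured - target), and its inverse as the naive
# starting point for the auto eq filter
diff_db = measured_db - target_db
eq_db = -diff_db
```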

I don't see any need for any other graphs in this situation, a spectrogram is pointless as is any peak hold graph and PSD seems a waste of time too.

Secondly we have real world content. This is fundamentally different to the calibration signal as it is non periodic and tends to have fairly large dynamic range, at least in the passband we're interested in. This means that a linear spectrum is not so useful because it's basically an average of the entire track by frequency and what we're really interested in is the peaks and how it is sustained over time. I think this means that we really want to see a spectrogram, a peak values equivalent of the linear spectrum and possibly the PSD (I'm not really sure whether this is useful in this context tbh but it's easy to provide). I also think we're only really interested in the summed vibration here as opposed to individual axes as the total effect is what we're interested in.

The interesting thing here is that we could also run a wav of the source clip through the same analysis and then present the two side by side and perhaps also show the difference between the two signals. This would then tell you how far away from the reference you are with real world content. I imagine the difficulty in doing that will be aligning the start and end time of the two signals, no idea if it's feasible to do that automatically. It would be a nice feature though.
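For what it's worth, one common automatic approach to the alignment problem is cross-correlation; a rough sketch (scipy assumed, function name hypothetical):

```python
import numpy as np
from scipy.signal import correlate

def lag_between(reference, measured):
    """Estimate how many samples `measured` lags `reference` by
    locating the peak of their cross-correlation."""
    corr = correlate(measured, reference, mode='full')
    return int(np.argmax(corr)) - (len(reference) - 1)

ref = np.zeros(100)
ref[10:20] = 1.0                # a burst at sample 10
meas = np.roll(ref, 25)         # the same burst delayed by 25 samples
print(lag_between(ref, meas))   # 25
```

Whether this works on real measurements depends on how similar the two signals actually are; it is robust for bursts and clicks, less so for heavily filtered content.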

Thoughts?

3ll3d00d 12-22-2016 02:38 PM

2 Attachment(s)
The basic analysis functions are implemented along with some simple wav or txt file parsing, details in https://github.com/3ll3d00d/vibe/blo...n/vibe/vibe.py

using the EoT scene as an example; red is peak, green is linear spectrum, blue is PSD

https://www.avsforum.com/forum/attach...1&d=1482446177

and a spectrogram using a slightly funky colour scheme

https://www.avsforum.com/forum/attach...1&d=1482446257

these are just example graphs generated using matplotlib in python btw.

notnyt 12-22-2016 03:20 PM

If you're using an rpi you can easily measure SPL at the same time as well with a cheap mic capsule.

BassThatHz 12-22-2016 06:31 PM

Quote:

Originally Posted by 3ll3d00d (Post 49199305)
I've been researching the signal processing side of things to check that the libs available can do the job we need without me having to actually write any serious amount of code, seems like the answer to that one is yes (which is nice).

Many people take this approach. Prototype each module in isolation, and then mix them together later on.
That way the sum of the pieces is "almost" guaranteed to do what you want.

Quote:

Originally Posted by 3ll3d00d (Post 49199305)
on individual axes along with the summed response

Just make sure you use 3d vector summation, rather than absolute values.

Otherwise your scaling will violate the laws of energy conservation and it will be way off, which will give you nightmares in the target curve and EQ stages.

Quote:

Originally Posted by 3ll3d00d (Post 49199305)
I imagine the difficulty in doing that will be aligning the start and end time of the two signals, no idea if it's feasible to do that automatically. It would be a nice feature though.
Thoughts?

Not difficult, just play a chirp/burp and time align the peaks. Then play the signal and perform sig-analysis using those relative timing offsets.

You may want to use 3 different frequency burps and average the offsets (say 15Hz, 30Hz and 60Hz). That way frequency related anomalies will be reduced.

One feature you might want is to display this offset so that you can calculate and thus add DSP delay to the tactile transducer or sub for time alignment.

3ll3d00d 12-23-2016 05:38 AM

Quote:

Originally Posted by BassThatHz (Post 49219105)
Just make sure you use 3d vector summation, rather than absolute value.

Otherwise your scaling will violate the laws of energy conservation, it will be way off, which will give you nightmares in the target curve and EQ stages.

It is not clear what the right approach to summing multi axis vibration is though it is clear that summing it is important if you want to improve the correlation between subjective perception and objective measurements. The relevant standards appear to be BS6841 which I believe evolved into ISO2631. Generally speaking these speak about applying a frequency weighting (which varies by axis and measurement type) and a scaling factor (which varies by axis). Some reports say use both, others say just use the scaling factor. There is also disagreement on the correct scaling factors too.

If I decide to go with frequency weighting then https://www.mathworks.com/matlabcent..._body_filter.m provides some matlab code to generate them, I imagine this can be adapted to python.

One method that I've seen suggested is a root sum of squares method, e.g. as per https://dspace.lboro.ac.uk/dspace-js...REPOSITORY.pdf (which also gives some suggested scaling factors), and another is VDV (vibration dose value), e.g. as per http://www.auburn.edu/~kam0003/347%20Binder1.pdf (which also gives a method to sum VDVs). There's also https://dspace.lboro.ac.uk/dspace-js...ndle/2134/6250 which is someone's PhD on the subject :)
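As a reference point, the VDV figure those papers describe boils down to a short computation over the (frequency-weighted) acceleration samples; a sketch, not the app's actual code:

```python
import numpy as np

def vdv(accel, fs):
    """Vibration dose value: the fourth root of the time integral of the
    fourth power of (frequency-weighted) acceleration, per ISO 2631-1.
    `accel` is in m/s^2, `fs` is the sample rate in Hz."""
    return (np.sum(np.asarray(accel) ** 4) / fs) ** 0.25

# a constant 1 m/s^2 signal for 16 seconds gives a VDV of 2.0
print(vdv(np.ones(16 * 100), fs=100))
```

The same fourth-power form is what makes VDVs from separate events summable as (sum of VDV^4)^(1/4).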

Quote:

Originally Posted by BassThatHz (Post 49219105)
Not difficult, just play a chirp/burp and time align the peaks. Then play the signal and perform sig-analysis using those relative timing offsets.

You may want to use 3 different frequency burps and average the offsets (say 15Hz, 30Hz and 60Hz). That way frequency related anomalies will be reduced.

One feature you might want is to display this offset so that you can calculate and thus add DSP delay to the tactile transducer or sub for time alignment.

I'm not doing playback from this software, just measurement, thus it's a problem of aligning some arbitrary measurements.

3ll3d00d 12-23-2016 01:23 PM

I think that point about the scaling factors would make a diff between measured and target acceleration very hard to implement as it implies "start here for your eq adjustment" but it won't have that effect (even if the seat responds linearly).

BassThatHz 12-25-2016 06:04 PM

The heavier and more dense the object, and/or the more lossy it is... the less the vibs will scale (with any method).
The scaling would literally have to change from material to material and object to object.

Getting this to be automated will be impossible. It will probably only work for you, in your room, with your system/objects.

Not sure what you have for a floor, but mine is 24 inches of reinforced concrete. The vibs I get are typically air-induced only.
Someone with a 2x4 floor on a 2nd-story rickety 200yr old house, will have far more vibs (wanted or unwanted).

Unfortunately your app won't translate at all from house to house, or even basement to upstairs.
The scaling would have to be manual and arbitrary. That's just the way highly-variant mechanical systems are. Is what it is.


I have no idea how linux usb audio works, but I'd imagine someone has solved this python PI audio problem already (and likely open-source).
Just a matter of doing it. You are already going to all this work to DSP streams of data and wave files. Having an audio I/O is really not much more work.
The chirp/burps could be either static wave files you pre-build or based on a sine-generator.

All the cellphone vib apps have real-time monitoring.
Not having this app be real-time nor interactive really dampens the sex appeal and usefulness IMO.
People are impatient these days, they like to see the results in the now. They want to boom their subs while looking at real-time charts updating.

Besides, being able to adjust the scaling in real-time to match targets would be EXTREMELY handy... otherwise the process will be slow and painful.
Run, look at the offline results. Nope, insufficient.
Re-run, look at the results. Nope, still insufficient.
Re-run, look at the results. Nope, still insufficient.
Re-run, look at the results. Bangs head against wall etc etc

Instead of just:
Run it; and watch it immediately meet or not meet expectations.

It just needs to do 1 or 4 updates per second @ 1 to 16k.
It's not like it has to do 120fps @ 512k...

3ll3d00d 12-26-2016 02:52 AM

Quote:

Originally Posted by BassThatHz (Post 49282537)
The heavier and more dense the object, and/or the more lossy... the less the vibs will scale (of any method).
The scaling would literally have to change from material to material and object to object.

Getting this to be automated will be impossible. It will probably only work for you, in your room, with your system/objects.

the scaling and summation is to account for human perception of vibration, the material through which it vibrates is irrelevant. You may want to read the links I provided.

Quote:

Originally Posted by BassThatHz (Post 49282537)
Not sure what you have for a floor, but mine is 24inches of reinforced concrete. The vibs I get are typically air induced only.

there is no known DIY way to measure pressure response directly, this is for vibration only.

Quote:

Originally Posted by BassThatHz (Post 49282537)
Unfortunately your app won't translate at all from house to house, or even basement to upstairs.

I know. Why do you think this is relevant?

Quote:

Originally Posted by BassThatHz (Post 49282537)
I have no idea how linux usb audio works, but I'd imagine someone has solved this python PI audio problem already (and likely open-source).
Just a matter of doing it. You are already going to all this work to DSP streams of data and wave files. Having an audio I/O is really not much more work.
The chirp/burps could be either static wave files you pre-build or based on a sine-generator.

I know it can be done, I have no known need for it though as I have no intention of using the rpi as the audio source.

Quote:

Originally Posted by BassThatHz (Post 49282537)
All the cellphone vib apps have real-time monitoring.
Not having this app be real-time or interactive really dampens the sex appeal and usefulness IMO.
People are impatient these days, they like to see the results in the now. They want to boom their subs while looking at real-time charts updating.

this made me chuckle as the expected audience for this app is somewhere close to 1. The number of people using VS regularly today on AVS is in the single digits, and that's a tool that's pretty much trivial to use. As such I'm not expecting a large audience for a solution that involves buying an rpi + an accelerometer on a breakout board, wiring it together and manually installing and configuring a python app.

Quote:

Originally Posted by BassThatHz (Post 49282537)
Instead of just:
Run it; and watch it immediately meet or not meet expectations.

It just needs to do 1 or 4 updates per second @ 1 to 16k.
It's not like it has to do 120fps @ 512k...

speaking as someone who has calibrated a nearfield setup using an accelerometer and an RTA view of its response, I can say that an RTA is nice to have but far from essential when calibrating. The process can involve multiple seating positions and multiple measurements (TR and FR), you may not have the ability to make real time changes to your EQ anyway, and deciding what to do can be an offline process (i.e. it requires thinking time). RTA also has the downside that it tends to make it hard to remember what you've changed and when you changed it, so repeatability is difficult. RTA view is good for a quick sanity check and some simple twiddling (e.g. level setting). Finally, RTA view is completely useless for comparing real world content clips.

These are the same arguments as to why people use sweeps rather than RTA in REW btw so there is nothing new to see here.

3ll3d00d 12-26-2016 04:03 PM

1 Attachment(s)
put together the code to talk to the device, enable/disable sensors, use the onboard FIFO and also streams data out to a handler callback in chunks to facilitate the multiprocess handling. I added a bunch of unit tests so it should work if the device behaves as expected (famous last words) -> https://github.com/3ll3d00d/vibe/blo...ibe/mpu6050.py

I also have the device itself in hand, it's surprisingly tiny

https://www.avsforum.com/forum/attach...1&d=1482796958

need to work out the wiring and hook it up next

andy497 12-27-2016 11:17 AM

I'll be very interested in what kind of data you get out of that MPU-6050 (besides also being very interested in the project in general). Those units are very popular in DIY quadcopter flight controllers. They are typically pretty noisy and with a biased distribution, so the best data comes after sensor fusion/Kalman filtering with multiple other sensors like gyro/gps/barometer. Hopefully this application has much higher SNR and that's not a problem. You'll have the advantage of being able to measure a stable periodic signal where you can average samples.

Also, I think those chips advertise a sample rate of 1000 Hz, but they may or may not have a non-defeatable hardware digital filter operating at 256 Hz. Even keeping the test signal frequency well below that, harmonics will be getting in from everywhere and may flood the measurement with aliasing (i.e. 2x Nyquist only applies with band-limited input). Or not. The DIY quad folks are making great strides in autonomous flight, and you can imagine a plate with four spinning blades on it is pretty riddled with high frequency vibration.

3ll3d00d 12-27-2016 02:29 PM

I hadn't noticed that bit of the doc, it seems slightly ambiguous as it says

The Sample Rate is generated by dividing the gyroscope output rate by SMPLRT_DIV:
Sample Rate = Gyroscope Output Rate / (1 + SMPLRT_DIV)
where Gyroscope Output Rate = 8kHz when the DLPF is disabled (DLPF_CFG = 0 or 7), and 1kHz when the DLPF is enabled (see Register 26)


i.e. implies the DLPF can be disabled

but then register 26 says that 0 means a 260Hz filter and 7 is reserved

Ultimately we only really need up to ~100Hz so we'll just have to see what comes out when I measure.
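For reference, the quoted formula can be written out as a small helper (a sketch; the function name is mine, the rates come from the register map quoted above):

```python
def mpu6050_sample_rate(smplrt_div, dlpf_enabled=True):
    """Sample rate per the MPU-6050 register map:
    rate = gyro_output_rate / (1 + SMPLRT_DIV), where the gyro output
    rate is 1 kHz with the DLPF enabled and 8 kHz with it disabled."""
    gyro_output_rate = 1000 if dlpf_enabled else 8000
    return gyro_output_rate / (1 + smplrt_div)

# e.g. a 500 Hz sample rate with the DLPF on needs SMPLRT_DIV = 1
print(mpu6050_sample_rate(1))                    # 500.0
print(mpu6050_sample_rate(0, dlpf_enabled=False))  # 8000.0
```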

andy497 12-28-2016 09:38 AM

The table for register 26 DLPF_CFG is confusing at best. Setting it to 0 seems to indicate disabled, but where are they getting the 260 Hz for accel and 256 Hz for gyro from?

3ll3d00d 12-28-2016 12:19 PM

1 Attachment(s)
Quote:

Originally Posted by andy497 (Post 49343649)
The table for register 26 DLPF_CFG is confusing at best. Setting to 0 seems to indicate disabled, but where are they getting the 260 hz for accel and 256 hz for gyro from?

I'm not sure where the specific values come from but I guess the non-defeatable nature of the filter is due to the ADC in the chip itself, i.e. an LPF is required to combat aliasing.

Attachment 1858369

I read the filter is a 1st order filter so that would leave us down about 6dB at Nyquist (given a 1kHz sample rate) and attenuation would start just above the passband we're interested in (which ends in the 80-100Hz range). I think this means it's not an issue for our use case.
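As a rough check on that 6dB figure, the roll-off of an ideal first-order low pass can be computed directly (a sketch assuming the 260Hz corner from the register map; the real filter may differ):

```python
import math

def first_order_lpf_attenuation_db(f, fc):
    """Attenuation in dB of an ideal first-order low pass at frequency f
    for corner frequency fc, from |H(f)| = 1 / sqrt(1 + (f/fc)^2)."""
    return 10 * math.log10(1 + (f / fc) ** 2)

# ~6.7 dB at Nyquist (500 Hz, given a 1 kHz sample rate)
print(round(first_order_lpf_attenuation_db(500, 260), 1))
# well under 1 dB at the top of the 80-100 Hz passband we care about
print(round(first_order_lpf_attenuation_db(100, 260), 2))
```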

I also searched a bit for info on the noise problem and it seems there are (were?) some batches of boards out there which used the wrong capacitors; seems fixable if you have smd soldering skills (I don't!) - https://forum.arduino.cc/index.php?topic=394691.0

3ll3d00d 12-28-2016 02:59 PM

I've been thinking about how to mount the chip itself without the (relatively short) cables unduly influencing how it moves. My thinking is to solder a right angle header onto the board to attach the wires to, then use double sided foam tape to attach the board itself to a somewhat larger foam base. Something like this bad ascii art

Code:

                                                |----|
                      pins    cable            |    |
            board  |-----=====------------------|  |-G------------|
            ========|                              |      RPi    |
  tape      ---------                              |--------------|
        ****************
  foam  ****************
        ****************
  tape  ----------------
  seat  ===============================================================

It might actually make sense to recess the board into the foam so the header pins and cable sit flush with the foam.

The foam could then be used as a safe base for adhering the device to the seat. I think double sided tape should work here as well (e.g. see https://www.endevco.com/wp-content/uploads/TP312.pdf for comparison of mounting methods, indicates tape works fine in this passband).

Intuitively this seems like it should work but we'll see. I could solder the wires directly to the board instead though that feels like it might be less stable because the wires will roam free, whereas the header pins serve to lock the wires into a consistent position. I suppose you could also argue the consistency of the wires aligned to the pins might introduce a systematic bias. You might also argue I'm overthinking this :)

BassThatHz 12-28-2016 06:17 PM

Quote:

Originally Posted by andy497 (Post 49317665)
gyro/gps/barometer. you can imagine a plate with four spinning blades on it is pretty riddled with high frequency vibration.

I would imagine that the barometer and gps aren't greatly impacted by drone motor vibs.
I'm no drone expert but I'd imagine they only need +-0.5 degree accuracy for the gyro to facilitate pitch/yaw/roll.

Small changes in position would have to be tracked via velocity/acceleration monitoring as civilian GPS isn't uber accurate (±2ft at best). It would have to be accurate to within "inches per second" if, say, they wanted to autonomously fly it through a window frame.
The largest noise sources are local wind and blade turbulence off nearby objects, which would be picked up as tilt changes in the gyro and thus used to reject large false positives (as well as fan voltage, I'd imagine).

In the case of audio though, I'd imagine the gyro/gps/barometer would be completely useless.
If a bulky cellphone can register transducer vibs I'd imagine a smaller, lighter, dedicated-chip, would be even more accurate than that, by a large margin.
I'm sure the sensitivity will be just fine so the only real question remaining is: bandwidth...
You'd want at least 100Hz, if not 200 or 300Hz. The more the better, obviously.

andy497 12-29-2016 10:01 AM

Quote:

Originally Posted by 3ll3d00d (Post 49348457)
I also searched a bit for info on the noise problem and it seems there are (were? some batch of boards out there which used the wrong capacitors, seems fixable if you have smd soldering skills (I don't!) - https://forum.arduino.cc/index.php?topic=394691.0

Awesome info.


Quote:

Originally Posted by BassThatHz (Post 49358849)
Small changes in position would have to be tracked via velocity/acceleration monitoring as civilian GPS isn't uber accurate (±2ft at best). It would have to be accurate to within "inches per second" if, say, they wanted to autonomously fly it through a window frame.
The largest noise sources are local wind and blade turbulence off nearby objects, which would be picked up as tilt changes in the gyro and thus used to reject large false positives (as well as fan voltage, I'd imagine).

Yes indeed. The simple case of position hold is a good test. If you rely on gyro and accelerometer alone, it ends up doing a random walk in the sky. If you LPF that signal too much to try to smooth things, you miss wind gusts and random bumps. Adding gps will help with catching drift over long periods of time, but it can bounce around by several feet second to second, so it's not too good for holding still. So you take as many sensor inputs as you can and try to weight them according to reliability. It's an interesting problem.

But yeah, for this application, the accel is what's needed, and hopefully it has sufficient SNR and bandwidth.

Dood, 90 degree header seems smart to me. I can't really see the wires impacting the measurement unless they are pulled really tight, but I guess you'll find out.

3ll3d00d 12-30-2016 03:15 PM

the recorder service seems ready for action - https://github.com/3ll3d00d/vibe/blo...der/service.py

I wrapped it up in a rest api and added something to report the device self test. It also allows for multiple accelerometers connected to one device (seems unlikely given the pi has a single i2c connection available but you can get expanders) and for the recording & processing of the data to run on a different machine (e.g. if you wanted to run multiple rpi's, one for each seat, so each device will then post the data back to the server).

I just need to actually test it with the device now.

I also had the idea of defining a playlist in jriver and then setting it to automatically work through that playlist while making & processing individual measurements for each track. Add another one to the backlog :)

notnyt 12-30-2016 03:24 PM

Quote:

Originally Posted by 3ll3d00d (Post 49416049)
the recorder service seems ready for action - https://github.com/3ll3d00d/vibe/blo...der/service.py

I wrapped it up in a rest api and added something to report the device self test. It also allows for multiple accelerometers connected to one device (seems unlikely given the pi has a single i2c connection available but you can get expanders) and for the recording & processing of the data to run on a different machine (e.g. if you wanted to run multiple rpi's, one for each seat, so each device will then post the data back to the server).

I just need to actually test it with the device now.

I also had the idea of defining a playlist in jriver and then setting it to automatically work through that playlist while making & processing individual measurements for each track. Add another one to the backlog :)

i2c is a bus, so as long as you can set the address of the device, you can have multiple on the same bus without issue. Otherwise, you'd need a multiplexer.

edit: looked at the datasheet, you can easily add a second on the same bus by supplying AD0 with 5v, which changes the address from 0x68 to 0x69.

If you ever need any assistance with this type of thing hit me up, I have a good deal of experience ;)
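Per the AD0 trick above, reading both addresses from python looks something like this (an untested sketch using the common smbus package and the standard MPU-6050 register addresses; not the thread's actual code):

```python
MPU6050_ADDR_LOW = 0x68   # AD0 tied low (the default)
MPU6050_ADDR_HIGH = 0x69  # AD0 tied high
PWR_MGMT_1 = 0x6B         # power management register, write 0 to wake
ACCEL_XOUT_H = 0x3B       # first of the 6 accelerometer data registers

def to_signed(high, low):
    """Combine two register bytes into one signed 16-bit sample."""
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

def read_accel_raw(bus, addr):
    """Read the raw x/y/z accelerometer words from one device."""
    data = bus.read_i2c_block_data(addr, ACCEL_XOUT_H, 6)
    return tuple(to_signed(data[i], data[i + 1]) for i in (0, 2, 4))

# on the rpi (untested):
#   import smbus
#   bus = smbus.SMBus(1)  # i2c-1 on a modern rpi
#   for addr in (MPU6050_ADDR_LOW, MPU6050_ADDR_HIGH):
#       bus.write_byte_data(addr, PWR_MGMT_1, 0)  # clear the sleep bit
#       print(hex(addr), read_accel_raw(bus, addr))
```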

3ll3d00d 12-30-2016 03:56 PM

How would you physically connect 2 devices to the RPI?

I had stopped looking into using 2 on one device tbh because the cable length seemed a deal breaker and the bus bandwidth seems like it would be an issue. You could create an extender but it seems cheaper, for the layman, to buy another pi instead.

3ll3d00d 12-31-2016 02:23 AM

the wiring is simpler than I thought, e.g. http://electronics.stackexchange.com...-a4-sda-and-a5

cable length still seems an issue though, I imagine making something like http://www.ebay.co.uk/itm/like/17091...7297426&crdt=0 yourself costs pennies if you have the capability to roll your own circuit boards though.

notnyt 12-31-2016 11:31 AM

Quote:

Originally Posted by 3ll3d00d (Post 49426209)
the wiring is simpler than I thought, e.g. http://electronics.stackexchange.com...-a4-sda-and-a5

cable length still seems an issue though, I imagine making something like http://www.ebay.co.uk/itm/like/17091...7297426&crdt=0 yourself costs pennies if you have the capability to roll your own circuit boards though.

keep the signal wires twisted, it'll help reduce noise. Also remember they'll need a common ground and still need to be powered. And AD0 on one will need 5v

3ll3d00d 01-01-2017 03:05 PM

Quote:

Originally Posted by notnyt (Post 49436281)
keep the signal wires twisted, it'll help reduce noise. Also remember they'll need a common ground and still need to be powered. And AD0 on one will need 5v

thanks, I'll give it a try once I get one up and running ok.

Speaking of which, I connected up one device and set up the app on the rpi. It is functional and the self test passes but it seems to be producing way more data than I appear to be able to consume. I don't think this is a function of my code/the rpi being too slow though. I have the bus at default speed (100kbps) and it seems to be taking ~4ms to read 30 bytes from the FIFO, this equates to ~60kbps so seems in the right ballpark. It should only be sticking 6 bytes per sample at 500Hz on the FIFO though, i.e. ~24kbps, instead the FIFO (1kB) is filling up in 35-40ms which is more like 200kbps. Obviously I must have something configured incorrectly so now time to find out what :rolleyes:

EDIT: I guess actually setting the sample rate would help, works fine now. Next step, verify the data.
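For reference, the arithmetic in the post can be written out (a sketch using the numbers quoted above):

```python
def fifo_data_rate_kbps(bytes_per_sample, sample_rate_hz):
    """Data rate into the FIFO in kilobits per second."""
    return bytes_per_sample * sample_rate_hz * 8 / 1000

def implied_rate_kbps(fifo_bytes, fill_time_s):
    """Rate implied by watching a FIFO of a given size fill up."""
    return fifo_bytes * 8 / fill_time_s / 1000

# expected: 6 bytes/sample (accel only) at the configured 500 Hz
print(fifo_data_rate_kbps(6, 500))        # 24.0 kbps
# observed: the 1 kB FIFO filling in ~40 ms
print(implied_rate_kbps(1024, 0.04))      # ~205 kbps, hence "~200kbps"
```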

3ll3d00d 01-02-2017 03:58 AM

1 Attachment(s)
data seems reasonable, need to investigate the self calibration bit next. This is with the device sitting on the desk in front of me and just plots vibration against time

Attachment 1867545

dominguez1 01-02-2017 05:42 AM

Quote:

Originally Posted by 3ll3d00d (Post 49474041)
data seems reasonable, need to investigate the self calibration bit next. This is with the device sitting on the desk in front of me and just plots vibration against time

Attachment 1867545

Awesome progress 3!

Will you be able to show this by frequency as well? Smoothed?

What's the parts cost in USD for this so far? Do you have a pic of the device?

3ll3d00d 01-02-2017 11:30 AM

1 Attachment(s)
graphs are shown further up the thread, a frequency response chart (acceleration vs freq) is one of them

total cost is about £50-55 here for rpi + case + cable + mpu6050 + sd card, the major cost is the rpi and I'm not sure how much they are over there. I guess $60-70 in total?

here's my extremely high tech development platform :)

Attachment 1868977

3ll3d00d 01-02-2017 01:22 PM

I'm not 100% sure what to do next

the correct approach would be to calculate and remove the gravity vector from the output, the downside is this looks slightly tricky if I roll my own (or mildly tedious and time consuming if I port some code from a C++ lib).

an alternative is to exploit the fact that typical use here is "put device on seat, play track" and hack that in via the offsets as per https://www.digikey.com/Web%20Export...are-offset.pdf

this method will fail if the device is subject to significant motion (i.e. it's not firmly secured to the seat) but then the data is rubbish at that point anyway

probably I'll bodge the hack in for now and see what data I get out, if ok then apply the proper approach later

3ll3d00d 01-03-2017 02:04 PM

Further research suggests the following options;

- add support for the onboard Digital Motion Processor (DMP) which fuses gyro and accelerometer data to yield, amongst other things, linear acceleration
Pros: should be accurate.
Cons: unknown algorithm used in the processor and no ability to refine or change it if required, sample rate is limited to 200Hz at most (possibly 100Hz, it's not entirely clear).

- roll my own sensor fusion algorithm based on the raw data.
Pros: can maintain the higher sample rate, can use potentially more computationally expensive (and accurate) filters, should be accurate if I write it properly
Cons: I have to write this myself which entails some risk given that I don't know how (quite a few articles online on how to do it though), will be inaccurate if I do a bad job :)

- use a simple low pass filter to calculate gravity per axis and then subtract that from the raw values
Pros: trivial to implement
Cons: seems this must be inherently inaccurate when the sensor is in motion (as this means the actual gravity vector is changing so an estimate of gravity calculated by a low pass filter will be wrong as the estimate lags reality)

still not sure what the best way to go is, the last one is obviously easy to do and is how an app like Vibsensor does it (which means this app should have results that are no worse than VS) so will do that first. I guess it will then mean implementing both the other two options and comparing the 3 approaches to see which one works best in reality.
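A minimal sketch of the third option, assuming numpy/scipy as used elsewhere in the project (this is not the project's actual code; as noted above, the gravity estimate lags reality whenever the device itself is in motion):

```python
import numpy as np
from scipy import signal

def remove_gravity(samples, fs=500, fc=1.0, order=2):
    """Estimate gravity per axis with a low-pass Butterworth filter and
    subtract it, leaving the vibration signal. samples has shape (n, 3)."""
    b, a = signal.butter(order, fc / (fs / 2), btype='low')
    gravity = signal.filtfilt(b, a, samples, axis=0)
    return samples - gravity

# e.g. a 1g static offset plus a 20 Hz vibration on the z axis:
fs = 500
t = np.arange(fs * 4) / fs
z = 1.0 + 0.1 * np.sin(2 * np.pi * 20 * t)
data = np.stack([np.zeros_like(z), np.zeros_like(z), z], axis=1)
out = remove_gravity(data, fs=fs)
print(round(float(out[:, 2].mean()), 4))  # DC removed, vibration kept
```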

3ll3d00d 01-04-2017 02:40 PM

4 Attachment(s)
implemented the low/high pass filter approach to isolate gravity and ran a quick test to compare VS against this device. I stuck the device to the table using double sided foam tape and put my phone next to it then measured for 10s while I tapped the table repeatedly.

mpu6050 with a 1Hz 6th order butterworth high pass, 500Hz sample rate

https://www.avsforum.com/forum/attach...1&d=1483568734

VS in high frequency mode (which means a 1Hz corner frequency for the high pass), 100Hz sample rate

https://www.avsforum.com/forum/attach...1&d=1483568734

things to note;

- mpu6050 registers a larger magnitude acceleration, I would think this is due to the reduced weight of the device
- mpu6050 shows a significant oscillation early in the measurement which is the latency of the filter based estimation of gravity (i.e. I need to start each measurement at least 1s early to be able to get a stable view of the resting gravity vector)
- I think that same filter latency is visible in the VS measurement in response to the shock (i.e. me tapping the desk); look at the way Z moves around each spike. Not 100% sure but it seems odd, and the mpu6050 doesn't have that problem as the filtered data clearly tracks the raw data more consistently (and the corresponding "tilt" data is cleaner than that shown by VS). This could also be a function of the weight of the device, i.e. a heavier device is harder to fix to the surface so the measurement is contaminated by motion of the device itself


tilt comparison
VS
Attachment 1873929

mpu
Attachment 1873937

derrickdj1 01-04-2017 08:21 PM

Is it too early to give a feel for how you think this will work for the VS thread? Not all of us will have this meter, so yours will have to be our gold standard.:)

dominguez1 01-05-2017 04:37 AM

Quote:

Originally Posted by derrickdj1 (Post 49570817)
Is it to early to give a feel on how you think this will work for the VS thread? All of us will not have this meter and yours will have to be our gold standard.:)

My thought is that we will all need to start using this meter...and 3 could start a little side business to produce these for us, or we could make it ourselves with instructions.

I'm thinking that this will be the tool to use to graph TR and include as part of the ulf scorecard. It won't have the 50hz limit of VS and will be more specific to measuring subwoofer TR.

@coolrda , thoughts?

3ll3d00d 01-05-2017 10:38 AM

2 Attachment(s)
here's a comparison of VS using separate 0-50 and 50-100 white noise measurements stitched together vs mpu6050 with 0-100 white noise. I took the raw data from VS and ran it through the same analysis as I have written for the mpu6050. My analysis is consistent with the data produced by VS itself for these exact measurements so I'm confident this is a valid comparison. I couldn't get the graphs to be identical but tried to get it pretty close so you can flip back and forth and visually inspect them.

conclusions...

1) there are significant differences in the data above 30Hz
2) the data from the mpu6050 looks quite similar to that produced by my (analogue, 1D) accelerometer in that the response continues to rise into the mid bass

graphs are PSD graphs only

MPU6050
Attachment 1875513

VS
Attachment 1875521

dominguez1 01-05-2017 10:46 AM

Quote:

Originally Posted by 3ll3d00d (Post 49591561)
here's a comparison of VS using separate 0-50 and 50-100 white noise measurements stitched together vs mpu6050 with 0-100 white noise. I took the raw data from VS and ran it through the same analysis as I have written for the mpu6050. My analysis is consistent with the data produced by VS itself for these exact measurements so I'm confident this is a valid comparison. I couldn't get the graphs to be identical but tried to get it pretty close so you can flip back and forth and visually inspect them.

conclusions...

1) there are significant differences in the data above 30Hz
2) the data from the mpu6050 looks quite similar to that produced by my (analogue, 1D) accelerometer in that the response continues to rise into the mid bass

graphs are PSD graphs only

MPU6050
Attachment 1875513

VS
Attachment 1875521

Awesome work sir. :cool:

How does it subjectively feel 60hz and up? Stronger than your frequencies 40hz and below?

In my case, it is a lot stronger in the lower frequencies than upper...and VS with the downward slope seems to be what I'm feeling.

3ll3d00d 01-05-2017 10:53 AM

Quote:

Originally Posted by dominguez1 (Post 49576817)
and 3 could start a little side business to produce these for us, or we could make it ourselves with instructions.

I intend to publish a deb (i.e. an installable package) which wraps this up so it can run self contained on a single rpi. It will also be possible to break it apart and run n rpi's to measure at multiple positions and then have them talk to a server process that logs the data and provides the UI. The bill of materials is very simple: 1 rpi + sd card for raspbian + cables + a case that gives access to the gpio pins. It's then just a question of a bit of config on the device to enable a few bits and pieces, install the deb and away you go (I hope!).

3ll3d00d 01-05-2017 11:06 AM

Quote:

Originally Posted by dominguez1 (Post 49592073)
How does it subjectively feel 60hz and up? Stronger than your frequencies 40hz and below?

the wobble is still the dominant factor, the midrange effect is perceptible though nowhere near the sort of thing mentioned in some of the comments in the MBM thread (the chest pounding stuff). The measurements I've taken in the past indicate this side of things is delivered from the main subs up front btw in my setup. The near field and far field ones complement each other pretty nicely.

dominguez1 01-05-2017 11:15 AM

Quote:

Originally Posted by 3ll3d00d (Post 49593185)
the wobble is still the dominant factor, the midrange effect is perceptible though nowhere near the sort of thing mentioned in some of the comments in the MBM thread (the chest pounding stuff). The measurements I've taken in the past indicate this side of things is delivered from the main subs up front btw in my setup. The near field and far field ones complement each other pretty nicely.

So do you think a downward slope is more representative of what you are feeling then? Does something need to be adjusted in the rpi so the slopes are similar to VS? Or perhaps weight is the cause of the downward slope?

3ll3d00d 01-05-2017 11:33 AM

Quote:

Originally Posted by dominguez1 (Post 49593593)
So do you think a downward slope is more representative of what you are feeling then? Does something need to be adjusted in the rpi so the slopes are similar to VS? Or perhaps weight is the cause of the downward slope?

remember that the shape of the isoperception curve (for vibration) is like

https://msis.jsc.nasa.gov/images/Section05/Image175.gif

i.e. you need a pretty steep increase in acceleration as frequency rises

I intend to add this as a target curve option eventually but I also need to sum the axes as well to get the total picture

I suspect the weight of the device (phone) is what causes the damped response at higher frequencies though it could also be a function of the lower sample rate. I'll take a measurement with the mpu6050 at a 100Hz sample rate at the weekend to determine which one it is.

dominguez1 01-05-2017 12:16 PM

Quote:

Originally Posted by 3ll3d00d (Post 49594313)
remember that the shape of the isoperception curve (for vibration) is like

https://msis.jsc.nasa.gov/images/Section05/Image175.gif

i.e. you need a pretty steep increase in acceleration as frequency rises

I intend to add this as a target curve option eventually but I also need to sum the axes as well to get the total picture

I suspect the weight of the device (phone) is what causes the damped response at higher frequencies though it could also be a function of the lower sample rate. I'll take a measurement with the mpu6050 at a 100Hz sample rate at the weekend to determine which one it is.

Yes, but that is acceleration. If you convert that to PSD, you should see it being flattish from 40-80hz, and a rising slope from 40hz and below.

An acceleration curve with a flat PSD looks like the below I believe...

http://i524.photobucket.com/albums/c...7.png~original

3ll3d00d 01-05-2017 12:44 PM

2 Attachment(s)
Quote:

Originally Posted by dominguez1 (Post 49596073)
Yes, but that is acceleration. If you convert that to PSD, you should see it being flattish from 40-80hz, and a rising slope from 40hz and below.

no, I don't think that is correct; it's based on an erroneous interpretation of how PSD is calculated (it's a normalisation method for a broadband signal).

Here's the linear spectrum (and the PSD again for comparison) for that data I showed earlier.

Attachment 1875761

Attachment 1875769
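The distinction between the two views is visible directly in scipy, which exposes both scalings in welch (a sketch on synthetic data, not the measurement above):

```python
import numpy as np
from scipy import signal

fs = 500
t = np.arange(fs * 10) / fs
rng = np.random.default_rng(0)
# a 40 Hz tone buried in a little broadband noise
x = np.sin(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(t.size)

# 'density' normalises power per Hz (a PSD); 'spectrum' gives power per bin,
# so a pure tone reads the same level in 'spectrum' mode regardless of nperseg
f, psd = signal.welch(x, fs=fs, nperseg=1024, scaling='density')
f, spec = signal.welch(x, fs=fs, nperseg=1024, scaling='spectrum')

print(f[np.argmax(psd)])  # the 40 Hz tone dominates both views
```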

dominguez1 01-05-2017 01:08 PM

Quote:

Originally Posted by 3ll3d00d (Post 49597209)
no I don't think that is correct, it's based on an erroneous interpretation of how PSD is calculated (as a normalisation method for a broadband signal).

Here's the linear spectrum (and the PSD again for comparison) for that data I showed earlier.

Attachment 1875761

Attachment 1875769

Sorry if this is redundant...but can you convert that target acceleration curve to PSD? Or, is this the complex math that you described before?

In your eyes, what are you shooting for, for our target chart? Acceleration vs Frequency? PSD vs Frequency?

Whatever it is, I think it's important to compare to a reference curve.

3ll3d00d 01-05-2017 01:44 PM

1 Attachment(s)
Quote:

Originally Posted by dominguez1 (Post 49598073)
Sorry if this is redundant...but can you convert that target acceleration curve to PSD? Or, is this the complex math that you described before?

I don't have this right now, I will produce it though.

Quote:

Originally Posted by dominguez1 (Post 49598073)
In your eyes, what are you shooting for, for our target chart? Acceleration vs Frequency? PSD vs Frequency?

Whatever it is, I think it's important to compare to a reference curve.

I agree that you need to start from a reference, however IMV the real use of that is as a way to inform my understanding of my preference. I don't really know what that reference is in this case; the only one I have found is an isoperception curve but I have no idea if that is a good target or not. I suppose this is one reason for me to build that bandpass sub, i.e. to get the ability to deliver more TR higher up and hence explore the shape of that preference further (well, that and I like building stuff!)

in other news, here's the tri axis sum using the RSS (root sum of squares) method and with X and Y scaled as per https://dspace.lboro.ac.uk/dspace-js...REPOSITORY.pdf

Attachment 1875873

I think the main thing to note here is how the dips are filled in by the XY axes. This does match my perception that my current setup feels quite smooth (i.e. no excessive peaks or dips).
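The summation can be sketched as follows (the 1.4 horizontal weighting here is illustrative only; the actual scale factors come from the linked paper):

```python
import numpy as np

def weighted_rss(x, y, z, kx=1.4, ky=1.4, kz=1.0):
    """Sum per-axis magnitudes into a single response using a weighted
    root sum of squares; kx/ky > kz places more weight on horizontal
    vibration, as in the thread's summed response."""
    return np.sqrt((kx * np.asarray(x)) ** 2 +
                   (ky * np.asarray(y)) ** 2 +
                   (kz * np.asarray(z)) ** 2)

# a dip on one axis gets filled in by energy on the others
print(weighted_rss(0.0, 0.0, 3.0))  # 3.0 (z only)
print(weighted_rss(1.0, 0.0, 0.0))  # 1.4 (x counts for more than z)
```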

coolrda 01-05-2017 04:28 PM

Quote:

Originally Posted by dominguez1 (Post 49576817)
My thought is that we will all need to start using this meter...and 3 could start a little side business to produce these for us, or we could make it ourselves with instructions.

I'm thinking that this will be the tool to use to graph TR and include as part of the ulf scorecard. It won't have the 50hz limit of VS and be more specific to measuring subwoofer TR.

@coolrda , thoughts?

I think it's great where 3 has taken this. My wish is for realtime display of multiple sensors and to have this run alongside OM/REW. I don't have a problem buying a product like this. This could be the Calman or Chromapure, compared to VS being freeware that gives one what HCFR does. Phenomenal freeware to get your feet wet, but then some move on and up to a more precise measurement platform.

awediophile 01-06-2017 12:54 AM

I just found this thread, and there's a lot to catch up on here. I'll just start with a few comments:

It deserves note what such a device actually measures vs. what we actually feel. (Many of you are probably already aware of this.) While this device could measure vibration transmitted through floors, furniture, and other objects in contact with our bodies, it cannot measure the tactile sensation that arises from sound (in the air) acting directly on our bodies. The transmission of acoustic energy from air to solid matter depends on impedance matching, and the impedance of different parts of each listener's body will respond quite differently compared to furniture and the like. Of course in practice, plenty of tactile sensation is felt through solids in contact with our bodies, so there is plenty of utility in such a device.

As noted in another post here, the vibrotactile human response curve peaks in the "low mid" frequencies, so being able to measure well beyond 250 Hz is probably a plus. Such capability might be useful for designing speaker enclosures among other things, where vibration is usually undesired.

Python is my primary language, and I would say the technology choices look very sound, except for the ReactJS. Not that there's necessarily anything wrong with ReactJS. I have no opinion. I'm just not a UI guy either. :) However, one package that might be real nice to look at for data plotting via the web is bokeh. What I like is that it includes a few simple UI controls and can interact with a server.

About Python concurrency and multiprocessing: Yes, there is a global interpreter lock (GIL) that prevents multiple threads from executing Python code within the same memory/process space at the same time. However, whether this is a major issue or not depends on your code architecture. IMO, the GIL is overblown as a programming restriction, typically by people who come from a Java or C++ world where the solution to any concurrency problem is almost always to use threads. I am of the strong opinion that using threads for any and all concurrency problems almost always leads to poor performance and/or stability. It's not that threads generically suck but rather that pretty much all programmers suck at writing code that uses threads.

The question is where you need concurrency and where you specifically need parallelism. There are many ways to achieve concurrency in-process in Python despite the lack of true parallelism. For example, using async libraries and the right architecture, Python code on a single core can handle on the order of tens of thousands of simultaneous network connections while keeping latencies reasonable. Those who need to scale to multiple cores often need to scale to multiple server instances, and by that point you need a multi-process architecture anyway. Lastly, the GIL can easily be bypassed by code that's written in C or C++, as long as it doesn't need to touch Python data structures. And for those rare occasions where multi-threading is necessary and multi-process is not, writing the threaded code in C or C++ will almost always be the better choice anyway.

Feel free to ask for clarification on stuff.

3ll3d00d 01-06-2017 10:28 AM

I had narrowed down to bokeh or plot.ly for a graph lib and had been leaning towards bokeh as the thing to try first. I will press on with that given your rec.

The only bit that needs a handoff is handling the data from the device as the bus is slow, the mpu has a small FIFO onboard (and the whole measurement is useless if it overflows) and the pi has pretty slow storage too. Current plan is just to do an async request to post the data out to the recorder (and then can coalesce those events if that backs up for some reason).
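A sketch of that handoff pattern (hypothetical names, not the actual recorder code): the sampling loop drops data onto a queue and never blocks on I/O, while a worker thread posts it out and coalesces any backlog into a single request.

```python
import queue
import threading

# Hypothetical sketch: the sampling loop must never block on the slow bus/storage,
# so samples go onto a queue and a worker thread posts them out, coalescing any
# backlog into one request.
sample_queue = queue.Queue()

def poster(post_fn, stop_event):
    while not stop_event.is_set() or not sample_queue.empty():
        try:
            batch = [sample_queue.get(timeout=0.1)]
        except queue.Empty:
            continue
        # Coalesce whatever else has backed up into a single POST
        while not sample_queue.empty():
            batch.append(sample_queue.get_nowait())
        post_fn(batch)

# Demo with a fake post function that just records what it was sent
received = []
stop = threading.Event()
worker = threading.Thread(target=poster, args=(received.extend, stop))
worker.start()
for sample in range(100):
    sample_queue.put(sample)
stop.set()
worker.join()
print(len(received))  # 100 -- nothing lost, order preserved
```

The point is that the producer side only ever pays the cost of a queue put, so the FIFO on the mpu never overflows because of slow network or disk.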

I plan to make a start on the analysis service this weekend so will hopefully have something basically useable some time early next week, depends how much time I get this weekend though.

3ll3d00d 01-14-2017 03:47 PM

I now have a basically functional pair of services, one that runs on the pi and takes measurements and the other which runs on another machine and receives/logs the data and will have the UI sitting on top of it. It's written to support 1 accelerometer per pi atm but as many pi's as you want so if you really want to load up your seating area with an array of pi's then go for it :) I will grab another pi soon so I can be sure this works.

Next step is to get a basic UI going so that I can at least trigger measurements & dump graphs on screen. I will need to add a bunch of error handling too as it's pretty brittle atm but will add that as I go.

I also need to add some instrumentation to the recorder so I can work out what sort of sample rate it can actually support and perhaps suggest a maximum achievable sample rate. 500Hz is certainly doable with accelerometer and gyro data at the default bus speed, 1kHz is not. If we can get away without gyro data (depends on the algorithm I end up using to strip out gravity) then that should bring 1kHz into range anyway.
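A back-of-the-envelope version of that bus budget (my numbers, assuming the rpi's default 100 kHz I2C clock and the MPU-6050's 6-byte accelerometer + 6-byte gyro payloads):

```python
def max_sample_rate(bus_hz, bytes_per_sample):
    # Each I2C byte costs ~9 clock cycles (8 data bits + ACK); real transfers
    # also pay addressing/start/stop overhead, so this is an upper bound.
    bits_per_sample = bytes_per_sample * 9
    return bus_hz / bits_per_sample

# Default rpi I2C clock is 100 kHz (assumption); MPU-6050 payloads:
# accel (6 bytes) + gyro (6 bytes) vs accel only.
print(round(max_sample_rate(100_000, 12)))  # accel + gyro: ~926 Hz ceiling
print(round(max_sample_rate(100_000, 6)))   # accel only:   ~1852 Hz ceiling
```

Which lines up with the post: a ~926 Hz theoretical ceiling with gyro data means 500 Hz works and 1 kHz doesn't, while dropping the gyro roughly doubles the headroom and brings 1 kHz into range.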

3ll3d00d 01-19-2017 03:06 PM

1 Attachment(s)
I've now tidied up the backend some more & put a bit of time into linking the project up with the various services (ci, coverage, quality) that you can pick up for free on github, which means I now have some shiny badges to display (the coverage one seems rather optimistic mind you, but whatever)

https://www.avsforum.com/forum/attach...1&d=1484867005

I've also made a start on installation/setup docs - https://github.com/3ll3d00d/vibe#buildinstallation

I think the backend piece is at least a v0.1 now, it seems fairly reliable anyway. Next I need to add some automatically generated graphs after a measurement succeeds so there is some analysed data available. After that, I can either sort out packaging of the app(s) so people can install it or I crack on with the webapp. If anyone pipes up saying they want to use this then I'll sort out the packaging, otherwise I'll continue on my merry way with the webapp.

dominguez1 01-22-2017 06:53 PM

Keep up the great work 3!

3ll3d00d 02-03-2017 01:48 AM

rather slower going than expected on the web ui front, haven't really had the time available to get going on this. I have managed to get the basic infrastructure in place though so I now have a UI running that talks to the analysis process, it's been beaten with an ugly stick atm but CSS is not my strong point :) hopefully faster progress now over the next few weeks to get something usable out there.

3ll3d00d 02-03-2017 03:02 PM

1 Attachment(s)
quick glimpse of 1st screen, not v exciting but it is at least a screen that works :)

https://www.avsforum.com/forum/attach...1&d=1486162911

3ll3d00d 02-05-2017 02:05 AM

1 Attachment(s)
config screen complete & I have made a start on the screen that allows you to make measurements.

https://www.avsforum.com/forum/attach...1&d=1486289025

3ll3d00d 02-05-2017 12:55 PM

1 Attachment(s)
and the basic layout of the measurement screen, some bugs to iron out in how this talks to the backend but that's a job for another day....

https://www.avsforum.com/forum/attach...1&d=1486328056

3ll3d00d 02-12-2017 05:01 AM

1 Attachment(s)
I think it's now pretty much working for configuring and measuring, it renders the updates properly as it goes so you can see measurements progress properly and get some info back if something goes wrong.

https://www.avsforum.com/forum/attach...1&d=1486904475

I can now start on the analysis side of things.

derrickdj1 02-12-2017 07:25 AM

That looks fairly simple. Can't wait til you're done.:)

dominguez1 02-12-2017 07:31 AM

Looks awesome d00d!

3ll3d00d 02-18-2017 04:42 AM

1 Attachment(s)
first graph, with controls for setting the ranges on each axis

https://www.avsforum.com/forum/attach...1&d=1487421686

3ll3d00d 02-18-2017 12:25 PM

1 Attachment(s)
it's not really designed for mobile (small screens are not ideal for graphs) but it's not hard to come up with a vaguely sensible mobile layout so ....

https://www.avsforum.com/forum/attach...1&d=1487449492

3ll3d00d 02-18-2017 04:50 PM

1 Attachment(s)
most of the controls now on screen for the basic graph view along with a basic breadcrumb style navigation view (so you can swap between measurements/devices/graphs from one place)

nearly made it to a 0.1 build :)

https://www.avsforum.com/forum/attach...1&d=1487465383

3ll3d00d 02-19-2017 08:17 AM

a short demo, there are a few glitches in the UI as you'll see but you get the impression


3ll3d00d 02-25-2017 11:09 AM

1 Attachment(s)
this took considerably more effort than I was expecting to get right but I have now added the ability to compare an arbitrary number of data sets on one graph.

To illustrate you can see the same data selector along the top has added the ability to add/remove an analysis of a selected measurement. This means you could compare a specific axis across n measurements and/or devices or compare peak spectrum to average spectrum (i.e. the speclab view of the world) or basically any combination you want.

https://www.avsforum.com/forum/attach...1&d=1488049752

3ll3d00d 02-28-2017 04:34 AM

the pi zero has been upgraded with wireless connectivity - https://shop.pimoroni.com/products/raspberry-pi-zero-w

less than half the price of the pi3 and smaller/lighter, looks like it should be a good choice for making a small integrated measurement system and the cheapness means you could setup an array of them across your seating area without spending too much cash. You could probably even just forget about a case :)

dominguez1 02-28-2017 05:58 AM

Quote:

Originally Posted by 3ll3d00d (Post 51051249)
this took considerably more effort than I was expecting to get right but I have now added the ability to compare an arbitrary number of data sets on one graph.

To illustrate you can see the same data selector along the top has added the ability to add/remove an analysis of a selected measurement. This means you could compare a specific axis across n measurements and/or devices or compare peak spectrum to average spectrum (i.e. the speclab view of the world) or basically any combination you want.

https://www.avsforum.com/forum/attach...1&d=1488049752

This is great 3! :D

So the green line represents the peak reading for all axes? Can you also sum the axes?

dominguez1 02-28-2017 06:30 AM

Is the parts list updated for Post 1? Are you planning on posting any more detailed instructions on how to build, download software, etc.?

I'm happy to be an early adopter. :cool:

3ll3d00d 02-28-2017 08:25 AM

Quote:

Originally Posted by dominguez1 (Post 51119177)
This is great 3! :D

So the green line represents the peak reading for all axes? Can you also sum the axes?

they are the 3 lines listed above so they're all PSD values but for different measurements and axes. This isn't really a comparison you'd do in reality, just an illustration of what you can do. I haven't added sum yet, will do so soon.


Quote:

Originally Posted by dominguez1 (Post 51119785)
Is the parts list updated for Post 1? Are you planning on posting any more detailed instructions on how to build, download software, etc.?

I'm happy to be an early adopter. :cool:

I am working on packaging atm & am writing some installation docs to go with it, should be done soon I hope

3ll3d00d 03-01-2017 03:26 PM

@dominguez1 first cut at the rpi setup docs are up at http://vibe.readthedocs.io/en/latest/install.html#

3ll3d00d 03-03-2017 03:22 PM

first release is now published to pypi so testing is welcome

https://pypi.python.org/pypi/vibe-recorder/0.1.2
https://pypi.python.org/pypi/vibe-analyser/0.1.2

install docs -> http://vibe.readthedocs.io/en/latest/install.html

I don't have any windows instructions atm, I need to see if I can package it as an exe to make it a bit easier for people (who aren't familiar with this sort of setup)

I need to add some scripts for auto startup, cleanup some of the logging and fix a couple of minor ui glitches....

3ll3d00d 03-05-2017 03:52 AM

1 Attachment(s)
I've uploaded an early version of a windows exe for the analyser piece - https://drive.google.com/file/d/0Bxd...ew?usp=sharing

You should just be able to run it and you'll see a console window open saying something like

Code:

Loading config from C:\Users\Matt\.vibe\analyser.yml
Reactor analyser is starting
2017-03-05 11:49:12,505 - analyser.twisted - ERROR - __init__ - Serving ui from
C:\Users\Matt\AppData\Local\Temp\_MEI86682\ui\index.html and C:\Users\Matt\AppData\Local\Temp\_MEI86682\ui\static

if so then open a browser and go to http://localhost:8080, you should see something like

https://www.avsforum.com/forum/attach...1&d=1488714680

To exit just press control+c in the cmd window.

This was built and tested on Win8.1 Pro using http://www.pyinstaller.org , I have no idea what compatibility is like across windows versions (my windows boxes are all Win 8.1).

This build also has a few UI glitches in it that I need to sort out before formally releasing it but would be good if someone tries it to see if it works ok on their machine.

3ll3d00d 03-05-2017 02:38 PM

I think this is the first proper release -> https://github.com/3ll3d00d/vibe/releases/tag/0.2.0

windows exe can be downloaded via the link
installation instructions for the recorder are at http://vibe.readthedocs.io/en/latest/install.html

I think the faffing around with packaging is done now so I can get back to actually writing the thing, next step is providing access to the summed view and target curves.

dominguez1 03-05-2017 07:36 PM

Thanks 3! Appreciate all your work on this sir.

I'm a windows guy, so the EXE helps. I'm definitely going to do this, but just haven't figured out just quite when yet. Anxious to try!

Make sure and post in the Vibsensor Thread...

3ll3d00d 03-07-2017 03:32 AM

1 Attachment(s)
I wonder if this would work

get a pi zero w
put it in a slim line case
use foam tape to firmly secure the breakout board to the case as per

https://www.avsforum.com/forum/attach...1&d=1488885858

solder wires directly to the headers on the board

the pi zero is 9g, a breakout board is about 5g, the case surely can't be very heavy... I reckon you could do this for £25-30 (no idea what prices are like in the US) so seems a good approach for a small, light, cheap, relatively easy to assemble sensor

3ll3d00d 03-07-2017 06:54 AM

https://www.adafruit.com/product/2883 is probably a good option for the locals

notnyt 03-07-2017 08:19 AM

Quote:

Originally Posted by 3ll3d00d (Post 51302233)
I wonder if this would work

get a pi zero w
put it in a slim line case
use foam tape to firmly secure the breakout board to the case as per

https://www.avsforum.com/forum/attach...1&d=1488885858

solder wires directly to the headers on the board

the pi zero is 9g, a breakout board is about 5g, the case surely can't be very heavy... I reckon you could do this for £25-30 (no idea what prices are like in the US) so seems a good approach for a small, light, cheap, relatively easy to assemble sensor

pi zero are $5, pi zero wireless are $10. I think most places like microcenter and fry's carry em. Adafruit is pretty good though, also.

3ll3d00d 03-10-2017 10:59 AM

I think I'm going to adopt the ISO (1631?) reference acceleration and convert the graphs to dB, this places 0dB at 1 micro m/s^2 which means 140dB is 10m/s^2 (i.e. about 1G)

main reason is that it simplifies the plotting of a normalised curve (for technical reasons...)
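For reference, the conversion is just dB = 20·log10(a / a_ref) with a_ref = 1e-6 m/s^2; a quick sanity-check sketch:

```python
import math

REF_ACCELERATION = 1e-6  # ISO reference acceleration: 1 micro m/s^2

def accel_to_db(a_ms2):
    """Acceleration in m/s^2 expressed in dB re 1 micro m/s^2."""
    return 20 * math.log10(a_ms2 / REF_ACCELERATION)

print(accel_to_db(10.0))     # 140.0 -- i.e. ~1G sits at 140 dB, as above
print(accel_to_db(9.80665))  # ~139.83 -- exactly 1G
```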

dominguez1 03-10-2017 05:59 PM

Quote:

Originally Posted by 3ll3d00d (Post 51397225)
I think I'm going to adopt the ISO (1631?) reference acceleration and convert the graphs to dB, this places 0dB at 1 micro m/s^2 which means 140dB is 10m/s^2 (i.e. about 1G)

main reason is that it simplifies the plotting of a normalised curve (for technical reasons...)

Makes sense...get everything in db...easier to speak the same language across different metrics.

Do you have a link for this ISO reference?

derrickdj1 03-10-2017 11:24 PM

A small side note: no need to reference things to 140 dB. That is way too high unless the reference used is lower. Anything over 128 dB from the subs is clipping somewhere with movies, and clean output was a goal of the thread. You can have 100 subs but the movies were only meant to be so loud, and ear protection should not get lost in all of this. The goal is to keep our hearing and enjoy it.:)

3ll3d00d 03-10-2017 11:33 PM

Quote:

Originally Posted by dominguez1 (Post 51408273)
Makes sense...get everything in db...easier to speak the same language across different metrics.

Do you have a link for this ISO reference?

It is mentioned in http://www.acoustic-glossary.co.uk/d...m#acceleration and http://www.diracdelta.co.uk/science/...l#.WMOn6WmnxnE

If you Google acceleration reference db then you get lots of hits including that "measuring vibration" b&k pdf.

3ll3d00d 03-10-2017 11:36 PM

Quote:

Originally Posted by derrickdj1 (Post 51412681)
A small side note: no need to reference things to 140 dB. That is way too high unless the reference used is lower. Anything over 128 dB from the subs is clipping somewhere with movies, and clean output was a goal of the thread. You can have 100 subs but the movies were only meant to be so loud, and ear protection should not get lost in all of this. The goal is to keep our hearing and enjoy it.:)

SPL dB (which is referenced against 20 micro Pa) and La dB (which is referenced against 1 micro m/s^2) are two completely different scales, so 140 dB on one means nothing on the other. I'm sure we can (will) arrive at a view on what dB constitutes reference though.

derrickdj1 03-11-2017 01:49 AM

Quote:

Originally Posted by 3ll3d00d (Post 51412809)
SPL dB (which is referenced against 20 micro Pa) and La dB (which is referenced against 1 micro m/s^2) are two completely different scales, so 140 dB on one means nothing on the other. I'm sure we can (will) arrive at a view on what dB constitutes reference though.


Matt, your work for this ULF and Vibsensor thread is appreciated at the highest levels. I am just putting out a caution for us to keep things real and relevant. We have to all admit, we are approaching levels found by many sources to be harmful to our hearing, and we need to keep things in perspective. I plan on doing dual UM18 boxes this summer but it will be a lateral move since I don't need more spl. It no longer matters what increase in db the project may net. I hope this is not coming off wrong. I am all for people making a better system, but caution against extreme spl for regular use.

Personally, I would love to hear how the perception is at 160 db clean. I can do the upper 130s but the room can't take it for too long. So, I'm near the limit without having to do some major mods.

3ll3d00d 03-11-2017 04:36 AM

for sure, I don't go anywhere near those levels myself. Just remember it's not an arcade game, you don't need the highest score!

dominguez1 03-11-2017 04:46 AM

3, have you done a comparison of your vibe psd compared to the vibsensor psd?

I'd love to see how they compare with each other.

3ll3d00d 03-11-2017 08:10 AM

Quote:

Originally Posted by dominguez1 (Post 51414617)
3, have you done a comparison of your vibe psd compared to the vibsensor psd?

there is an example in this post, I'll repeat in more detail at some point.

seems like weighing down VS produces similar results <25-30Hz but IMV VS is not useful for >30Hz.

3ll3d00d 03-11-2017 01:29 PM

2 Attachment(s)
got a basic normalisation function going, for example a simple x y z view

https://www.avsforum.com/forum/attach...1&d=1489267633

and then with the z axis selected as the reference (so it becomes a flat line)

https://www.avsforum.com/forum/attach...1&d=1489267633

it's a bit limited atm as it is done client side and relies on measurements being taken at the same sample rate (will expand that later)

next step is to allow target curves to be set then you will be able to pick a target curve and use that as the reference curve
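A sketch of what that normalisation amounts to (plain lists for brevity; the app itself is numpy-based): in dB terms it's just a subtraction, which is why the chosen reference curve comes out as a flat line.

```python
def normalise(curves_db, reference_db):
    # In dB space, normalising against a reference is just subtraction:
    # the reference becomes a flat 0 dB line and every other curve shows
    # its delta against it. Assumes all curves share the same frequency
    # bins (same sample rate / FFT length), as the post notes.
    return {
        name: [c - r for c, r in zip(curve, reference_db)]
        for name, curve in curves_db.items()
    }

x = [100.0, 105.0, 110.0]
y = [98.0, 104.0, 112.0]
z = [95.0, 100.0, 105.0]
out = normalise({"x": x, "y": y, "z": z}, z)
print(out["z"])  # [0.0, 0.0, 0.0] -- the reference is flat
```

Swapping the reference for an arbitrary target curve is the same subtraction, which is presumably why target curves are the natural next step.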

