Wednesday, 27 February 2013

Installing and First Steps with Ubuntu Studio and StealthPlug

Do you wanna get rocked?

A few weeks ago I finally managed to repartition a Windows Vista laptop so I could install Ubuntu Studio on it and try my hand at recording again, something I hadn't done for quite a few years. For those unaware, Ubuntu Studio is a flavour of Ubuntu tuned for low-latency, (near) real-time work (shocker) that comes bundled with a whole host of very useful music-making and production software.

To install Ubuntu Studio, the first step (not surprisingly) was to download the ISO image (here is the download page). Now came the first wrinkle: I had great trouble getting the BIOS to recognise large USB sticks. I would usually just dd the image to a stick and boot from it, but this failed with the 3-4 sticks I had lying around. I therefore resorted to the tried and tested method and burnt a DVD. This worked without problems and I breezed through the install process. (I seem to remember a question about including non-open-source software, which I said yes to. That was the only thing I had to actually think about.)
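For the record, the 'dd the image to a stick' approach I'd normally use looks something like the sketch below (wrapped in Python for consistency with the rest of this blog). The ISO filename and /dev/sdb device are placeholders - triple-check the device name before running anything like this, as dd will cheerfully overwrite whatever you point it at.

import subprocess

# Hypothetical names - substitute your actual ISO and USB device
iso = "ubuntustudio.iso"
stick = "/dev/sdb"  # the whole device, not a partition like /dev/sdb1

# Write the image block-for-block to the stick, then flush the caches
subprocess.check_call(["dd", "if=" + iso, "of=" + stick, "bs=4M"])
subprocess.check_call(["sync"])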

So now I had a working install of Ubuntu Studio dual booting on my old Vista laptop. Next step: getting my StealthPlug working on it. Now this is where the audio system in Ubuntu Studio requires a bit of explanation (and note that I'm by no means an expert here!). It seems the best method for getting the smallest latency is Jack. This is a very clever bit of software that 'registers' any inputs and outputs (both physical and software-created) and lets you link any and all of these together as you like (like putting jack leads between them, which I guess is where it gets its name. Or that could just be a massive coincidence). As long as the Jack software is running you can hotplug these as much as you want.
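To make that concrete, here's a rough sketch of what those 'patch leads' look like from the command line (QJackCtl just does the same thing graphically). The jack_lsp and jack_connect tools ship with Jack; the port names below are only examples, so run jack_lsp first to see what's actually registered on your system.

import subprocess

# List every input/output port currently registered with Jack
print(subprocess.check_output(["jack_lsp"]))

# Wire a capture port to a playback port - the software equivalent
# of plugging a jack lead between them
subprocess.check_call(["jack_connect", "system:capture_1",
                       "system:playback_1"])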

So, how to get Jack to recognise the StealthPlug? Well, plugging it in made it appear in both /dev and in the main UI. However, by default Jack runs against the main sound card, so the inputs/outputs the StealthPlug supplies don't show up. The secret is in the setup panel: after starting Jack (Audio Production -> QJackCtl), go to Setup and you should see something like this:


If you change the selected hardware device to the plugged-in device (the arrow next to Interface will tell you which - hw:1 for my StealthPlug, for example) you should be away. To test it, fire up Guitarix (a rather cool open source amp simulator), select a sound (I went for HighGainSolo here) and then wire up the Jack controls something like the following:



and it should start making noise. Well, I say that - make sure you're listening on the headphone output of the StealthPlug, otherwise you won't hear anything!
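If you'd rather script the wiring than click it together, something like the sketch below should be equivalent (with Guitarix already running). Fair warning on the assumptions: hw:1 is simply where the StealthPlug landed on my machine (see /proc/asound/cards), and Guitarix's Jack port names (gx_head_amp/gx_head_fx here) vary between versions - run jack_lsp to find the real ones.

import subprocess
import time

# Start Jack against the second sound card (the StealthPlug, for me).
# The first -d picks the ALSA backend, the second -d picks its device.
jackd = subprocess.Popen(["jackd", "-d", "alsa", "-d", "hw:1"])
time.sleep(2)  # crude wait for Jack to come up

# Route the StealthPlug's capture into Guitarix's amp input...
subprocess.check_call(["jack_connect", "system:capture_1",
                       "gx_head_amp:in_0"])

# ...and Guitarix's effects output to both (headphone) playback ports
for playback in ("system:playback_1", "system:playback_2"):
    subprocess.check_call(["jack_connect", "gx_head_fx:out_0", playback])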

So I now have a low-latency monitoring solution running with very little trouble. Next job: direct the output to the main card while still using the StealthPlug as input, and add in MIDI and a USB mic as well.

Thursday, 7 February 2013

Modelling a Tyre and Rim in Blender

I've never been good at drawing - so now I get the computer to do it.

A few months ago, I stumbled across a website called Blender Guru. I've been tinkering with 3D for many years, going all the way back to using Imagine on the Amiga. I've never really got anywhere though, mostly due to a complete lack of artistic ability, but also in part due to 3D modelling being very difficult, the associated software being quite pricey and there being a serious lack of tutorials on the subject (at the time, anyway). In these enlightened times, though, at least some of these problems have been solved thanks to the incredibly powerful (and, more importantly, free) Blender, coupled with an internet's worth of tutorials, guides, etc. etc.

Now Blender is very good. Professional-level good, in fact. However, it also has one of the most unintuitive interfaces I've ever come across this side of a text adventure. On opening it up, there are buttons everywhere: most with cryptic names, some of which produce more buttons, and some that are hidden unless you know the particular incantation to reveal them. The learning curve can most accurately be described as a step function. Couple this with 3D modelling still being quite a tricky thing to do, and I thought it would be another thing I'd play with for a bit and then bounce off.

And then I found Blender Guru. This site contains many cool things but mostly it contains very detailed and easy-to-follow video tutorials (some just text) that take you through all sorts of aspects of modelling. It not only explains what you're doing and why as far as the modelling goes, it also manages to get you through the crazy interface to the point where it even starts to make sense.

Currently, I've only followed one tutorial through to completion (this one), but here are the results:



It's not perfect and I still need to work on the lighting (I'm hoping another tutorial will help me there), but it's a start :)

One point to note: I modelled the rim and tyre separately but, when combining them, I really wanted to try 'linking' the tyre object into the rim scene, so I could alter the tyre in the other file and have the changes show up in the rim file. This was not as trivial as I thought: just selecting 'Link' from the File menu, navigating into the .blend file to the parent Tyre object and selecting it did bring the object in, but with its position and rotation fixed, which wasn't what I was looking for. A swift prayer to the Google god showed me that you actually need to Group the to-be-linked object in its file and link to that rather than the object itself. This worked like a charm.
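For anyone wanting to script it, the same trick looks roughly like this in Blender's Python console. Treat it as a sketch only: the operator names have moved around between Blender versions, and 'tyre.blend'/'Tyre' are made-up names standing in for your own files.

import bpy

# In tyre.blend: put the selected tyre object(s) into a named group
bpy.ops.group.create(name="Tyre")

# In rim.blend: link the *group* rather than the object itself, so it
# arrives as a movable instance that still tracks edits to tyre.blend
bpy.ops.wm.link_append(directory="//tyre.blend/Group/",
                       filename="Tyre", link=True, instance_groups=True)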

As soon as I've set up a git repository, I'll upload all .blend files and supporting files. Next on the tutorial list is an 'Introduction To Texture Nodes'. Sounds swish.



Monday, 4 February 2013

Controlling Lego Mindstorms with Linux

In another 200 years I might have built the terminator

Last Christmas, I was the very lucky recipient of the incredibly awesome Lego Mindstorms kit. Here's a picture of all that awesome:


Yes indeed - computer controlled Lego. If I'd got this 20 years ago I wouldn't have seen daylight until I had to leave home.

Now the way this works is that there is a microprocessor controller brick that can have up to 3 motors and 4 sensors connected to it. In theory, you build your robot (or whatever) using the included Lego (and any other bits you have lying around), design a program for it using the included LabVIEW-based language, download it to the control brick and away you go.

Now this is all well and good and gives you quite a bit of control. Here's a case in point:


However, though I appreciate the benefits of LabVIEW, I'm more of a C++ kind of guy. I also have a long-term plan of using another of my presents this year, a Raspberry Pi, as the main controller, and maybe throwing in an Arduino as well for a bit more flexibility.

This will therefore necessitate an API to the controller. A quick bit of googlage pointed me at a promising-looking Python-based option: nxt-python. This not only allowed all the file access and compilation options I could want, but also (and this was the important bit) had a direct, real-time control option. What was even better was that my Mint install had it in the software manager (search for 'nxt'). A couple of clicks later and it was ready to try out. Awesome.

Or not. The version in the repo is a bit behind the main release (v2.2.1-2 instead of v2.2.2) and contains a rather critical ultrasonic sensor bug. However, I was still able to plug the brick in via USB (after building the basic tracked vehicle in the instructions), turn it on, and use the following code to get it to move rather drunkenly around:

 
import nxt.locator
from nxt.motor import *

def spin_around(b):
    # Run the two track motors in opposite directions so the robot
    # spins on the spot; turn(power, tacho_units) turns each motor
    # through 360 degrees of rotation
    m_left = Motor(b, PORT_B)
    m_left.turn(400, 360)
    m_right = Motor(b, PORT_C)
    m_right.turn(-400, 360)

# Find the first brick connected over USB (or Bluetooth) and spin it
b = nxt.locator.find_one_brick()
spin_around(b)


Obviously, this requires you to plug motors into ports B and C :)

This code was shamelessly nicked from the examples that came with the nxt-python install and can (probably) be found here:

/usr/share/doc/python-nxt/examples/


These contain code for using the speaker and reading the sensors, the latter of which needed a bit of hacking to work with the ultrasonic one. If you run it as is, you get this error:

    sensor = Ultrasonic( BRICK, inPort)
  File "nxt-my\nxt\sensor\generic.py", line 95, in __init__
    super(Ultrasonic, self).__init__(brick, port, check_compatible)
  File "nxt-my\nxt\sensor\digital.py", line 73, in __init__
    sensor = self.get_sensor_info()
  File "nxt-my\nxt\sensor\digital.py", line 156, in get_sensor_info
    version = self.read_value('version')[0].split('\0')[0]
  File "nxt-my\nxt\sensor\digital.py", line 143, in read_value
    raise I2CError, "read_value timeout"

As the traceback is basically saying, there is a timeout issue when reading the ultrasonic sensor. Again, Google came to my rescue and pointed me here. After making the suggested correction (i.e. increasing the loop count to 30 on line 84), all was right with the world.
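With that patched, reading the sensor is pleasingly simple. Here's a minimal sketch along the lines of the bundled examples - I'm assuming the sensor is plugged into port 4, as per the standard build, so adjust if yours differs:

import nxt.locator
from nxt.sensor import *

b = nxt.locator.find_one_brick()
# get_sample() returns the measured distance (in centimetres)
print(Ultrasonic(b, PORT_4).get_sample())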

So I now have a computer-controlled robot (sort of) that can be told what to do through Python. This is certainly a start, but if I'm going to control it with the kind of code I have in mind, I'm going to need something a bit more heavy-duty. Next job: running Python from C++.