Suffrage Science – decoding the brooch

A week ago I went to London to borrow a piece of scientific jewellery for a couple of years. It’s a delightful, rather bonkers scheme by the MRC called Suffrage Science: two years ago they chose 6 women computer scientists to each receive a brooch, and last week each of them handed her brooch on to the next woman. In just under two years’ time I get to hand mine on to the next person, and that way it passes from scientist to scientist. I was given the brooch by the excellent Professor Carron Shankland from the University of Stirling.

The event was good fun – here’s me having received the award, and having had some nice free wine…

and here’s the group picture of all the donors and the recipients – they did Mathematics and Computer Science in the same event, so there were about 12 of us being passed jewellery by another 12. Carron’s the one hiding in the back with the yellow top.

Enough about the socialising though. What about the brooch?

It was designed by Veronika Fabian, who was at the time a student at Central Saint Martins – she won a competition with the design. And it’s lovely.

It’s a model of a piece of curled punched tape, with a 5-bit encoding. Each row of holes or spaces (two positions for data, one smaller sprocket hole, then three positions of data) encodes a binary pattern, which represents either a letter, a number, or a control sequence. These tapes were used in early telecoms and computing to transfer and store information. Three of the holes contain small jewels in the Suffragette colours – purple for loyalty, white for purity and green for hope. It turns out that if you put a piece of paper behind the holes, you can just about make out the pattern. So I spent a little time this afternoon doing just that.

With the pattern transcribed to a piece of paper, I made a guess at the character encoding (I went for ITA2, as it seems the most common and was in widespread use in early computing). ITA2 uses codes with few holes for common letters and codes with more holes for rarer letters. I started by filling in all the E characters (hole in position 1, all others empty, so 10000) and spaces (hole in position 3 – 00100). I thought this looked a bit odd as there weren’t many Es in the message, but it turns out it was right. As I filled in more letters, certain Suffragette statements started emerging. The complete message is “DEEDS NOT WORDS COURAGE CONSTANCY SUCCESS THROUGH THICK : THIN WE N’ER GIVE IN SUFFRAGE SCIENCE AWARD 2016 DEEDS NOT WORDS COURAGE CONSTANCY S”. To get numbers and punctuation there’s a shift signal (11011), and a second shift signal (11111) to go back from numbers to letters.
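
If you’d like to play along at home, here’s a minimal Python sketch of the decode. This isn’t how I actually did it (that was paper and pencil), and the figures-shift table below only covers the handful of characters this particular message needs:

# Standard ITA2 letters-shift table, indexed by code value (bit 1 = least significant)
LETTERS = ['\0', 'E', '\n', 'A', ' ', 'S', 'I', 'U',
           '\r', 'D', 'R', 'J', 'N', 'F', 'C', 'K',
           'T', 'Z', 'L', 'W', 'H', 'Y', 'P', 'Q',
           'O', 'B', 'G', None, 'M', 'X', 'V', None]   # None = the two shift codes
FIGURES = {19: '2', 22: '0', 23: '1', 21: '6', 14: ':', 5: "'"}  # just what the brooch uses
FIGS, LTRS = 0b11011, 0b11111

def decode(rows):
    out, figures = [], False
    for row in rows:                  # each row is a string like "10010", position 1 first
        code = sum(int(bit) << i for i, bit in enumerate(row))
        if code == FIGS:
            figures = True            # shift to numbers/punctuation
        elif code == LTRS:
            figures = False           # shift back to letters
        else:
            out.append(FIGURES.get(code, '?') if figures else LETTERS[code])
    return ''.join(out)

print(decode(["10010", "10000", "10000", "10010", "10100"]))   # prints DEEDS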

A picture of my decoding is below – click for a larger version. There’s one character missing and it’s in a tricky part of the brooch to see, so I think the error is probably mine not the designer’s. The missing character should be 11111 after 2016 to go back to letters for the final “DEEDS NOT WORDS”.

The image below shows the relationship of the words to the brooch. When you’re wearing it the things that can be seen are “DEEDS NOT WORDS” on the upper, small loop, and across the back of the brooch in the uncurled part you can see most of “THROUGH THICK AND THIN WE N’ER GIVE IN”. On the outside of the lower, larger loop there’s “SUFFRAGE SCIENCE AWARD”.

Null sector installation – EMF2018

For EMF2018 (my general blog post about the festival can be found here) Charles Yarnold and my old friend Ben Blundell built a cyberpunk zone, called Null Sector, with installations and all sorts of cool stuff.

I made a tiny part of this, in the form of a surveillance themed installation which sat behind the cyberpunk-style grill in the bar area.

The aim of the installation was to provide a slightly disconcerting surveillance-style view of the people in the bar, matching the general branding of Null Sector, so it seemed as if the company running Null Sector (Polybius Biotech) were carrying out video surveillance of attendees. There was no real video surveillance involved, just a bit of Raspberry Pi/OpenCV smoke and mirrors.

Polybius Biotech logo by Ben, who has real talent at this graphic design stuff

The Polybius Biotech brand has quite a clear graphic style and a defined palette of colours – orange, magenta, blue and black (mostly). So my aim was to do a bit of face detection and a bit of live video enhancement, cast it all to this colour scheme, and display a constantly updating video feed.

The way I did this was to have a handful of different background styles (solid black, edge-enhanced, edges only, greyscale, sepia, colour-flipped, full colour); for each frame grabbed by the camera, the system did some face detection, then applied one of the styles to the main image and another to the face regions. The system then drew boxes around the faces in Polybius Orange or Polybius Blue.

Which colour schemes it used was a random choice, and stuck for a random number of frames. There were 14 different foreground & face combinations, and each one would be kept for a random number of frames between 1 and 100 – this made watching the screens more interesting, as you never knew how long a particular view would go on for, and the views looked pretty different.
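
My actual code was C++ (of which more below), but a toy Python sketch of the style-holding loop looks something like this – the styles are cut down to three, and the BGR values are stand-ins rather than the real Polybius palette:

import random
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

ORANGE, BLUE = (0, 128, 255), (255, 64, 0)    # BGR stand-ins for the palette
style, box_colour, frames_left = None, ORANGE, 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frames_left == 0:                      # pick a new look, hold it for 1-100 frames
        style = random.choice(["grey", "edges", "colour"])
        box_colour = random.choice([ORANGE, BLUE])
        frames_left = random.randint(1, 100)
    frames_left -= 1

    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if style == "grey":
        out = cv2.cvtColor(grey, cv2.COLOR_GRAY2BGR)
    elif style == "edges":
        out = cv2.cvtColor(cv2.Canny(grey, 80, 160), cv2.COLOR_GRAY2BGR)
    else:
        out = frame.copy()

    for (x, y, w, h) in cascade.detectMultiScale(grey, 1.3, 5):
        out[y:y+h, x:x+w] = frame[y:y+h, x:x+w]   # faces keep the live image
        cv2.rectangle(out, (x, y), (x+w, y+h), box_colour, 2)

    cv2.imshow("polybius", out)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break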

It went quite well but was not without hiccups. There were two things I could have done to make it better:

  • Test the system in lower light conditions – when it got crowded in the shipping container, the light getting to the cameras wasn’t that great and so sometimes it just didn’t find many faces
  • Optimise the code. I was running the system as fast as it would go, which gave about 3 frames per second. Raspberry Pis are great fun but not massively fast when it comes to live video manipulation. That was fine in itself, as it gave everything a cool, kinda laggy effect. But the unintended consequence of running the Pis flat out was that they overheated pretty quickly. I think I could have got around this with a bit more forethought and better-optimised code

At this point in a blog post I normally say “And here’s the code!”, but to be honest I’m a bit embarrassed about the code quality here – it’s 600 lines of unoptimised C++ all in one file and full of stupid errors. I’ll tidy it up, maybe, and use it for something else. However, I will give a bit of a deep dive into a few of the gory technical details.

GORY TECHNICAL DETAILS

  1. The face detector was a bog-standard Viola-Jones style cascade classifier, from OpenCV’s standard libraries
  2. The edge detector was even more old-fashioned, as it used Canny, which was published in 1986. Works, though.
  3. The input came from a Raspberry Pi camera module, which is a cheap way to get pictures into a Pi, and it has a very small form factor (if you look at the image 2 pictures up you can see it – it’s a little green square on the bottom left of the monitor).
  4. My monitors were borrowed from Aberystwyth Computer Science and came from a decommissioned computer lab – they were old Sun Microsystems flat-screen jobs. These had DVI & VGA inputs but the Pis only have HDMI outputs, so I had to get some adaptors. It turned out the adaptors I bought from eBay were crap, so I had to find some new cables with a day to go – fortunately, hacker camps are good places to find random cabling.
  5. OpenCV has some graphics output functionality but it’s a bit clunky. As my monitors were all identical, I was able to sidestep the OpenCV graphics and write directly to the framebuffer. (That is: the bit of memory which corresponds to what’s shown on the monitor – low-level graphics can be hacked about by writing directly to it.) There’s a great tutorial about doing this stuff on a Pi at Raspberry Compote – and a small sketch after this list – but in short, you need to…
    1. Find out the size of your monitor
    2. Find out whether the monitor is RGBA or BGRA or RGB or BGR
    3. Set up a pointer to the bit of memory which makes up the monitor input
    4. Convert your image to the right format (e.g. RGBA in my case)
    5. Splat the image onto the right bit of memory and Hurrah! The display will change
  6. Finally, I set the machines up with a fairly cyberpunky desktop background (“The sky above the port was the color of television, tuned to a dead channel”) and to load the video program on boot. This meant I just had to plug it all together, plug it in, and it would all just work.
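
To make that concrete, here’s a rough Python sketch of those five steps. My installation did this from C++, and the monitor size here is a made-up example – read the real size and pixel format from fbset or /sys/class/graphics/fb0/ first, as some Pi setups use 16-bit RGB565 rather than 32-bit RGBA:

import numpy as np
import cv2

W, H = 1280, 1024                               # step 1: monitor size (example value)
# steps 2 & 3: map the framebuffer, assuming 4 bytes per pixel in RGBA order
fb = np.memmap("/dev/fb0", dtype=np.uint8, mode="r+", shape=(H, W, 4))

frame = cv2.imread("test.png")                  # any image, as BGR
frame = cv2.resize(frame, (W, H))
rgba = cv2.cvtColor(frame, cv2.COLOR_BGR2RGBA)  # step 4: convert to the right format
fb[:] = rgba                                    # step 5: splat it, and hurrah!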

In all – it was fun, it nearly worked well, people seemed to like it, and I learned a lot about building things for long-term installation in a public place. Next time will work better.

Electromagnetic Field 2018

Electromagnetic Field is a massively friendly not-for-profit hacker and maker camp which happens every two years. I went in 2012 and spoke about women in tech, and I went again in 2016 and spoke about doing robotics with kids. This year I’ve been trying to do a bit less work and get a bit less stressed, so I decided not to submit a talk or workshop. Then my mate Ben put out a call for installations for a cyberpunk zone and I ended up pitching an idea for a display to sit behind the bar. This installation took – as you might imagine – longer than anticipated to pull together.

After a late-evening arrival on the Thursday and a night spent in a tent at 7°C with a very thin sleeping bag, I was not 100% ready for festival land. Turns out that being sober and cold in a field full of people and ducks makes it hard to get to sleep. My first action (post coffee) on the Friday morning was to ask a friend to pull into Argos on their way in, so I could have a bit more padding and insulation.

On the Friday I spent most of the day putting together my installation in the Null Sector cyberpunk zone, and generally helping out in Null Sector with carrying, lifting and general tidying up. This was a new element of the festival, made up of a bunch of shipping containers with installations, robots, flamethrowers and other cyberpunky goodness. It provided a DJ area and an additional bar for the festival, as well as a night market (night one), an electronics exchange (night two) and a generally cool place to hang out (all nights). My installation was in the bar, which opened a little later than anticipated.

me standing at the door of the bar saying to all who pass “This will be a bar soon, but we’re still trying to fix the ethernet to the till”.

I’m not going to write anything about my installation in this post as I think it deserves a bit more of an in-depth description in a post of its own. After dark, with lasers and flamethrowers and so on, Null Sector really looked impressive.

Dancers in the Null Sector lit up with lasers, flamethrowers and EL wire.

Other than the prep and debugging in Null Sector, I didn’t see much of the main festival on the Friday. I did make it to a couple of talks though, and the first talk I went to was also one of my favourites of the whole weekend. It had all the ingredients of a fine EMFCamp presentation: cute robotic elements, anecdotes, how-to details, tales of unsuccessful iterations, and a live demo which pretty much worked but was at times baffling. You can watch the talk here:

Libby Miller talking about building a telepresence robot out of an Ikea lamp – github link

Throughout the weekend the Hacky Racers were running, a bunch of modified small electric vehicles (at least I think they were all electric…). These raced around a straw bale track – I think a lot of different people were involved in driving each one. They had enduro races, time trials and all sorts of contests. Surprisingly fast at times, and very entertaining. It would have been easy to spend an afternoon just watching, if there weren’t a million and one other things to look at and play with.

One of the Saturday workshops I did was a soft circuits workshop with the excellent Helen Leigh, author of the soon-to-be-published book “The Crafty Kid’s Guide to DIY Electronics”. The workshop was a build from the book, making a light-up emoji sparkle heart. I didn’t get my official EMFCamp badge to work, so I spent most of the rest of the festival wearing this light-up sparkle heart instead of a badge. I’ve been following Helen on Twitter since the last EMF and I can’t wait to read her book and buy it for all my age-appropriate relatives (and probably myself, first).

Helen and I with my sparkly heart emoji

On the Saturday night there was a screening of the film Hackers, which I had never seen before; I think that means I’m probably not as much of a geek as I thought I was. Seen it now though. Great film. A++++ will watch again. HACK THE PLANET.

Every year I’ve been to EMF I’ve just missed out on the Titanium Spork workshop by Richard Sewell. This year I made it – I got there half an hour early and was the last person to get in. Then I spent 2.5 hours drawing, thinking, cutting in cardboard, cutting in titanium, then a whole load of hammering. At the end, I had my own piece of cutlery. SPORK. Since I got back I have said “Do you want to see my spork?” to about 20 people.

SPORK and @jarkman

On the final night I spent much of the time in Null Sector dancing to the DJ’ing of Chemical Adam – he gave another of my favourite talks of the weekend, on somatosensory music, body hacking and beyond. I thought – based on the sheer nuts value of his talk – that the DJ set was likely to be pretty mad too. Spoiler alert: it was good, but not completely bonkers. Take that as you will. Lots of us danced to it.

After the music stopped – bang on 11, as that’s when the licence said it had to – I hung around playing embarrassingly anatomically dynamic computer games with a bunch of Aberystwyth graduates, which was a really pleasant way to round off a lovely evening and a great festival. I’m already looking forward to 2020.

Reflections on 6 months of part time

For the last 6 months I’ve been part time at 70%, in pursuit of a bit of headspace and some work-life balance. This was part of Aberystwyth University’s “flexible working” scheme where you can apply for different hours for 6 months on a trial basis, so it was a relatively risk free way of experimenting with a little bit more free time.

Obviously, it’s not always possible to get work done in the time available (and some things – like open days or travel – I didn’t count, because nobody bothers counting them). Generally I’ve tried to stick to my days off, though, keeping track of my hours, and here I am at the end of the period owed just 4 days. That’s a level of slippage I can deal with, and I’ll take those days here and there before term starts.

Did I get everything done at work? Well, I’ve had to be a little less perfectionist than usual. I’ve tried to manage everything WRT teaching, admin and research, and my workload has been reduced a bit (no tutees). I do feel like I’m missing stuff, and I am no longer on top of everything that goes on in the department, but that’s OK – it’s also part of the point of being part time. I don’t actually have to do everything myself.

So that I didn’t let myself sit indoors messing about on the Internet for a day and a half every week, I set myself some “part time challenges”, because nothing says work-life-balance like a to-do list. I’ve been pretty good at this. On the list were some physical things – walk up a big hill (Cadair Idris), get fit enough to cycle home from town. I had to cycle up the back way (still not fit enough to cycle up Penglais Hill) but I have made it home from town on my bike.

Others were dorky, and these have already been reported on this blog, as I built a retro games controller and a drum kit ballpit.

The remaining things on the list were creative; I wanted to paint a picture using watercolours as I’d never used them before, and a picture using acrylics because I’m out of practice. I also decided to teach myself Blender, which is a 3D modelling system. I’ve managed all of these except for Blender, and I think I’m going to give up on that. It’s a fiddly piece of software and I’m happier writing code than digging about in menus so if I get into computer graphics I’ll do it at a lower computational level.

Here’s one of the watercolours, which is a poster of things around Exmouth that my sister-in-law might like:

Here’s the acrylic, which is (as usual) of Penglais woods:

The problem with that woodland picture is that the patch of pale pathway looks a bit like a ghostly sheep from a distance. I’m going to have to fix that.

I’ve found it quite enjoyable having the time to do things that aren’t work, and to do some dorky “almost work” things which I would otherwise not have had time for. So I’ve gone part time as a permanent measure. This means that in the future this blog’s readers (both of them) can expect more crap watercolours and daft electronics, and fewer conference reports.

Inventeurs meeting, London

A couple of weeks back I went to London to observe an EU project meeting as external evaluator. The project is a direct descendant of the Playful Coding project and has some of the same partners, so it’s good to see what they’re getting up to after that project ended. Inventeurs is a project which looks at transnational collaboration on coding activities, particularly to support migrant children. The UK partner this time is London South Bank University (LSBU) who I have done quite a bit of work with in the past.

LSBU are based at Elephant and Castle which is pretty near where I grew up – we started our meeting at their campus and discussed progress on the work. As part of the project the partners are creating a MOOC (Massive Open Online Course) to get teachers involved in the project, covering the pedagogical aspects and the technical aspects. It was really heartening to see the progress being made on this; it will be a very useful resource for teachers getting into coding and collaboration.

After lunch, we walked to a school in Walworth, which took the group down East Street Market and Walworth Lane, both roads I remember clearly from my early years in London. I suspect I was the only person in the group feeling heavily nostalgic; everyone else was simply captivated by the colours and the sights. It’s not tourist London.

In the first school we took over a business studies classroom and talked about management of the project; the classroom decoration was apt for Mireia’s talk!

At the end of the school day we travelled further south and had an evening session in Peckham Library, looking at collaboration between classes and how we can get schools in different countries to work together on an extended topic over 10 weeks. There are some difficult hurdles to jump here – some of them are to do with prior knowledge and pedagogical issues but I suspect the biggest hurdle might turn out to be term dates.

At 7 we left the meeting and headed to the dinner location (missing the first goal in the England-Croatia game…). Moving around London by tube during a heatwave is not that fun.

The following day we started early at Southbank Engineering UTC, a school in Brixton. We were at this school for the whole day and met some pupils who’d been taking part in the project. The school’s emphasis on engineering and technology was really cool – the walls were papered with fascinating posters and the students we spoke with seemed very engaged.

School dinners are school dinners though. This photo captures the dining hall before the kids showed up (I’m not going to post photos of random schoolkids on the blog). Having spent quite a lot of my life in south London comprehensives, the atmosphere during school dinner was familiar and I have to say not entirely comfortable.

In the afternoon we spent most of the time discussing project plans and how things are going to work for the final period of the project. In September the project opens up to other schools and anyone can join, so there are a lot of things to get right (training, connections, the organisation of school partnerships). The idea is that two classrooms in different countries will work together on a 10-week project, devoting a couple of hours a week to it, building up a collaborative animated story online using Scratch. It’s a great, ambitious project, involving tech, art, storytelling, transnational collaboration, and themes of social justice.

There was a lot of work done that afternoon, but we did pause for a game I call “Europeans trying Marmite”.

Women in Tech Cymru summer conference

On July 7th Aberystwyth University hosted the first summer conference for Women in Tech Cymru, a new group which aims to support and network women working in tech in Wales. We were lucky to get the Vice Chancellor of Aberystwyth University to come down, open the event and welcome everyone to the uni, and then we had a keynote from Phillipa Davies. I’m afraid I can’t say much about the keynote, as I was out on the registration desk for that slot.

The event was an “Unconference”, which means that, largely speaking, the attendees made up the conference on the fly by pitching sessions and then breaking out into small groups to discuss things. Here’s a photo of some people unconferencing.

I’ve set up a Google Drive folder for any slides or materials, and will add them as I get them.

Some notes on refreshment and childcare

I am a strong believer that if you’re running an event on a Saturday that’s not for kids, you should put on something for the kids, or at least offer childcare. It’s not a problem I’ve ever had to solve myself, but I have enough friends and family who’ve managed to reproduce to understand that this is the big sticking point when it comes to attending weekend stuff, particularly (although obviously not exclusively) for women. Thanks to BCS Mid Wales we were able to employ 2 people from the award-winning Aber Robotics Club to run a LEGO robots workshop with 6- to 16-year-olds. They built LEGO robots from scratch, raced them remote-controlled by Android tablets, modified them for various purposes (speed, agility, strength) and then had a robo-sumo contest against our house robots. This was a major hit – one mum had difficulty getting her offspring to leave.

We wanted the event to be free for attendees, but we also wanted to raise some money for running costs. So we sold tickets as “supporter tickets” (£15) or “big supporter tickets” (£30), with free entry available for anyone; a sort of pay-as-you-feel conference. I really liked this idea and it’s one I’ll use again, as it means people don’t have to commit money up front if they’re not sure, but it’s also easy for people to tangibly help the event. There was no difference at all between supporter and free attendees at the event, so no kudos (or stigma) was associated with either decision. In the end, about half the lunch cost came from sponsorship and half from ticket sales, which is great.

Jamie McCallion of 13Fields Ltd kindly contributed sponsorship towards lunch, and BCSWomen sponsored the coffee breaks, so we were sufficiently caffeinated and fed. We even managed to rustle up funds for 70 Welshcakes in the afternoon break :-)

The tech stream

I had never been to an unconference before, and thought it might be useful to have a slightly more traditional conference element too – for starters, it let us have more detail in the programme, which I think helps people decide whether or not to attend. As Aber were hosting, I twisted the arms of 3 superb colleagues to run 30-minute introductory talks on hot tech topics. First up was Azam Hamidinekoo, with “Why you should be interested in deep learning and how you can get started”, which provided an overview of some recent exciting developments in neural networks. This was a very clear talk and I’ve had some great feedback from attendees on it.

Next up, just after lunch, was Amanda Clare, talking about word2vec and the ways in which the vectorisation of word semantics can be useful in working out the semantic relationships between concepts, but can also be problematic in terms of bias (the associations with feminine terms are often worse than those with masculine terms, for example).

And our final tech talker was Edel Sherratt, talking about the Internet of Things, and how we probably want to start looking at formal models and specifications for it, particularly when we consider security. Edel has done a lot of work on standards in other contexts, and it was really interesting to see this work being put forward in a new setting. It could have real impact if it takes off!

At the end of the day there was a panel session, followed by a group photo and a bunch of people drinking lemonade on the pier and admiring Aberystwyth. The photo is below – it captures less than half the attendees as a lot of people left at 3 (some football match or other)…

All I want for Christmas is a drum kit ballpit

At EMFCamp in 2016 there was a musical ballpit, and ever since having a go in it (which was my first ever ballpit experience) I have wanted to build one.

This has become something of an ongoing project. First, I contacted some soft-play suppliers to find out about the cost of balls, and it turns out that soft-play balls (proper ones, for use in commercial soft-play facilities) cost quite a lot: 15p+VAT each. Which doesn’t sound like much, but to fill a large paddling pool that was going to be something like £2,000, and even if I tried to bodge it with a smaller paddling pool it was going to be really very expensive indeed.

So plan B was formed. I figured that 5p a ball was OK, and I wasn’t bothered with quality. So for the last 18 months or so, every time I’ve seen kids play balls on special offer (in the supermarket, in B&M, in Lidl…) I’ve bought them as long as they’re £5 or less for 100. They pile up. Slowly, but they pile up.

There’s more to the problem of building a drum-kit-ballpit than just having a lot of balls, though. You need to make the balls do something – in particular, motion in the ballpit needs to trigger sounds. I did this through a mixture of OpenCV and pygame, in Python, and if you’re interested you can see my hacky code here: on github. Basically, the program runs the webcam and plays a drum sound if it sees motion in specified parts of the image. The readme on that project describes in more depth how the code works, if you’re interested in the details. Point the webcam at a ballpit, set the program running, bang the drums.
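
The core idea, boiled down to a sketch (the real code on github has several drum zones and a bit more robustness; the zone coordinates, threshold numbers and “snare.wav” file here are all made up for illustration):

import cv2
import pygame

pygame.mixer.init()
drum = pygame.mixer.Sound("snare.wav")        # hypothetical drum sample
X, Y, W, H = 100, 100, 200, 200               # one "drum" zone in the image

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # difference between this frame and the last, in the drum zone only
    diff = cv2.absdiff(grey[Y:Y+H, X:X+W], prev[Y:Y+H, X:X+W])
    moving = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)[1]
    if cv2.countNonZero(moving) > 500:        # enough moving pixels: hit the drum
        drum.play()
    prev = grey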

At the BCS Mid Wales show and tell last week I put the components together – paddling pool, balls, tripod, and motion detection. I hadn’t actually tested it all together before the evening, although I had got Helen to agree to sit in it. As you can probably guess from this slightly giggly video (thanks Colin):

Ballpits are fun.

Building a retro-games controller

I decided to build a retro games controller based on something I saw on the internet. There are lots of discussions and videos and howtos, but to be honest I’ve never been particularly good at following instructions, so I just bought a kit from Arcade World (the two-player Xin-Mo board one – here’s a link) and had a go at bodging it together.

It came with some instructions. Here’s a picture of the instructions, along with a pound coin for scale. I did read these instructions. Then I googled, to find slightly more detailed instructions. Then I went “fuck it” and just got on with the job.

Prototype #1 had 6 buttons per player (with select and start, too). It was based on a plank we found in our garage when we moved in.

The buttons look pretty cool when they’re attached. Nice bright colours. Tidy.

Then I wired it up. The kit came with “easy” clip-together wires which had a jumper at the board end and a metal clip at the microswitch end. I can tell you now that it is remarkably easy to wire a joystick up incorrectly – upside-down is my most common error.

However, this wiring and board had a lot of good properties for a prototype.

  • If I plugged it into my laptop (running Linux, of course), and ran jstest on /dev/js0, and pressed the buttons, they all sent some kind of signal
  • If I plugged it into a Raspberry Pi running RetroPie, it identified as a controller
  • Plugged into the laptop it could control, badly, a driving game. Accelerate and reverse were unintuitive, but left and right worked
  • The joysticks might have been upside down but they felt really cool to play

There were also a couple of negative aspects, though.

  • I couldn’t get RetroPie to calibrate the thing, as it was expecting more buttons on a Xin-Mo (I thought)
  • The left and the right joystick both controlled the same character so 2-player didn’t really work
  • The left and right controls were too close together anyway for comfortable play

So I removed the components from the board, smashed it up with a hammer, and put it in the woodpile. Time for a new plank and a new start with a little more understanding. Not too much understanding though. There were a couple of big mistakes left to go.

Removing the components from the board is non-trivial, as the metal clip-on attachments don’t come off easily – indeed, once they’ve clipped on they’re not supposed to come off at all. So a small screwdriver and a bit of leverage was required every time I wanted to change the wiring: for example, to remove the components from the old board, and then to make sure the joysticks were wired the right way up. Again. Prising the connectors open was fiddly and slippery and generally not straightforward.

Of course, having bent the metal clips open, they then started falling off all over the place during testing. So I had to squeeze them all shut again, so that I wasn’t debugging software and hardware at the same time.

Anyway, I got it all together on the new board, with each button attached to the right pin on the Xin-Mo interface (eventually). Tested it on the laptop – worked OK. Tested it on the RetroPie setup and I could navigate menus, select stuff, and even start a game. There is a modification you have to make to some config files to get the controller to show up as two joysticks, so I did have to edit some stuff (instructions here: on the RetroPie site). Getting closer.
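
For the record, and from memory (so do check the RetroPie instructions linked above rather than trusting me on the exact IDs), the two-joystick fix is a usbhid quirk appended to the line in /boot/cmdline.txt, something like:

usbhid.quirks=0x16c0:0x05e1:0x040

where the first two numbers are the Xin-Mo’s USB vendor and product IDs, and the final flag tells the kernel to treat the board as multiple input devices.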

Still one big problem though. Some of the buttons were behaving really strangely. Back to jstest to see what’s what…

What jstest gives you is a big table of inputs, and when you touch the controls it shows you what’s coming in for each input. The snippet below shows me moving the joystick (Axes up and down), and pressing buttons 0, 1, 2 and 3 in turn.

Axes:  0:     0  1:-32767 Buttons:  0:on   1:off  2:on   3:off   ... 
Axes:  0:     0  1:     0 Buttons:  0:on   1:off  2:on   3:off   ... 
Axes:  0:     0  1: 32767 Buttons:  0:on   1:off  2:on   3:off   ... 
Axes:  0:     0  1:     0 Buttons:  0:off  1:off  2:on   3:off   ... 
Axes:  0:     0  1:     0 Buttons:  0:on   1:on   2:on   3:off   ... 
Axes:  0:     0  1:     0 Buttons:  0:on   1:off  2:off  3:off   ... 
Axes:  0:     0  1:     0 Buttons:  0:on   1:off  2:on   3:on    ... 

Now the more observant reader will notice that some of these switches default to on, and others default to off. Whoops. Turns out that microswitches have three prongs to attach wires to (common, normally open and normally closed) and it actually matters which ones you choose: wire to “normally closed” and the input sits on until you press the button. Who knew? On the side of the switch, top prong good, bottom prong bad.

Finally, there were a couple of buttons which didn’t have leads (maybe I lost a piece of wire? Or maybe they were spare buttons?). I soldered those in with a spare cable I stole from Tom of Sam and Tom industries. Doesn’t everyone do their soldering on the cooker?

With everything finally tested, I put a base plank on, so that you can have the controller on your lap on the sofa as well as on a surface. This is the finished product:

And this is Sonic the Hedgehog II on my nice big telly.

BCSWomen Lovelace Colloquium 2018

The 11th BCSWomen Lovelace Colloquium was held just before Easter, at the University of Sheffield with support from Sheffield Hallam University. Regular followers of this blog will know that the day has a well-defined format, with student posters, speakers, a panel on computing careers, and a social at the end of the day. We also have a cake sponsor, so we also have too much cake.

cake

This was the first colloquium since I started the conference in 2008 where I wasn’t, in some sense, the conference chair. Cardiff, in 2010, was chaired by Miki Burgess (and I didn’t even make it to the event), but I was still involved in the financial side and the lineup. This year I handed over properly to Helen Miles and was officially just deputy chair. We had two excellent local chairs in Heidi Christensen (tUoS) and Deborah Adshed (SHU), and our long-term supporter Amanda Clare came to lend a hand too. But Helen was in charge.

Any worries I might have had about the running of the event evaporated on arrival. Due to some bad fog in Aberystwyth I set off much later than anticipated (not wanting to drive with zero visibility), and then we got lost in Stockport just for fun, so I turned up about 4 hours after I had planned to arrive.

On the one hand, whoops!… on the other hand, Heidi, Amanda, Helen and company had managed to do pretty much all the pre-event stuff without me. Magic.

On the day it was actually huge fun helping out without feeling I had to go to every session. So during the afternoon talks I helped tidy up, made sure the sponsors and stallholders knew what was going on, collected poster contest judging results and so on – the sort of thing that helps the day go more smoothly – and left Helen to be the figurehead.

Indeed this was, I think, the fourth Lovelace where Helen, Amanda and I have all been on the helper staff together. Between us we are quite good at working out what needs to be done and just getting on with it. To be honest, seeing Helen in the chair was just brilliant – partly because she did such a great job, and partly because for the first time ever there was someone at the Lovelace more stressed than me (sorry Helen :-)

Here’s me and Helen at the evening social – relaxing at last.

me and helen

Robot club coding project

In Aberystwyth Robotics Club we run after-school robot-based activities for local schoolkids. This year we’ve just started a new term, and we decided to take in twice as many new starters as before. The idea was to be more open and let more kids have a go; it doesn’t matter if they haven’t been keen enough in the past to try out computing or robotics, what matters is that they’re interested now and they want to give it a go. If they drop out at Christmas, that doesn’t matter either – they’ve given it a shot.

So we are running two parallel sessions for the complete beginners. Half of them are doing a new “coding taster project” and half are doing the normal robot club starter (soldering, circuits, Arduino basics and a bit of LEGO robots), and then we swap. I’ve been running the coding half, and we’ve been hacking around with face-controlled games. The rest of this blog post is about the coding stuff – and what it’s like doing computer vision based real-time coding with a bunch of 11-12 year olds.

If you’re just here for the resources to try this yourself they’re linked at the bottom of this post.

What did we do?

I came up with a four-week programme of sessions (2h per session) which use the computer vision library OpenCV to animate, read the camera, detect things in the camera feed and then make a game. Broadly speaking the plan is as follows…

Week 1: animation

The first week was all about simple graphics: drawing dots, squares and lines on a canvas, having a loop and changing things in the loop, and thinking about colours and R, G, B space.
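
To give a flavour of the level (a guess at a minimal example, not the actual worksheet code), the week 1 material boils down to something like a ball bouncing on a canvas:

import numpy as np
import cv2

x, dx = 50, 5                                          # ball position and speed
while True:
    canvas = np.zeros((480, 640, 3), dtype=np.uint8)   # a black canvas
    cv2.circle(canvas, (x, 240), 20, (0, 0, 255), -1)  # a red ball (colours are B, G, R)
    x += dx
    if x < 20 or x > 620:                              # bounce off the edges
        dx = -dx
    cv2.imshow("animation", canvas)
    if cv2.waitKey(30) & 0xFF == ord('q'):             # ~30ms per frame; q quits
        break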

Week 2: video and faces

Next up we looked at video and faces. This involved thinking of a video as a set of still images (get an image from the camera, put it in the window, repeat). Then we tried running the face detector and working out what it can do. This session was a bit short – we spent some time changing sensitivity settings and min/max face size settings, and trying to fool the face detector by drawing pictures of faces. I’ve beefed this session up in the worksheets, so hopefully next time round (in 2 weeks’ time) it won’t be short.

Week 3: collision detection (when does the ball hit the face?)

This session involved bringing together week 2 (rectangles around faces) and week 1 (balls moving on the screen) and working out when the rectangle and the ball overlap. This is harder than it sounds.
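
A tidy version of the check we were aiming for (the function name and arguments here are mine, for illustration): clamp the ball centre to the rectangle to find the nearest point on it, then see whether that point is within one radius of the centre.

def ball_hits_face(bx, by, r, x, y, w, h):
    """Does a ball at (bx, by) with radius r overlap the rectangle (x, y, w, h)?"""
    nearest_x = max(x, min(bx, x + w))   # closest point on the rectangle
    nearest_y = max(y, min(by, y + h))   # to the centre of the ball
    return (bx - nearest_x) ** 2 + (by - nearest_y) ** 2 <= r ** 2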

Week 4: game time

Now we introduce classic game elements, including scores, levels, lives and so on. There was also a lot of changing colours, and using graphics to overwrite the face area (e.g. a smiley face). Customisation was a very popular activity – no two games looked alike.

How did it go?

At the end, everyone had a game working, and a lot of the kids wanted to spend longer on their project. This was not all their own work – we have a lot of volunteers so I think the ratio was about 1:3 helper-to-student, and the project is really ambitious. But I do think all the kids have a fairly solid understanding of how their game fits together, even if they didn’t write all the code themselves.

The code was quite hacky and the entire project was based around getting something working quickly rather than coding properly. A lot of the kids had tried Scratch before, but I think only one or two had looked at any textual programming language, and Python is quite a leap from Scratch. I was keen to keep it fun, so I provided them with a lot of code to edit (each week had a starting code file which they had to develop). This way they got things on the screen and spent their time doing animation rather than doing exercises. Robot club is, after all, supposed to be fun…

I think that all the kids have learned a few fairly key concepts associated with coding: case sensitivity, indentation, if statements, debugging, comments, the idea of iteration, the idea of a variable, and the general way coding best proceeds by small changes and lots of testing. I don’t think any of them could take an empty file and write a python program, but they could probably look at a simple program with conditionals and so on and have a guess at what it does, and maybe edit some aspects.

In terms of higher-level concepts, they have a good idea about what a face detector is and how it works; they have a grasp of image-based coordinate systems, maths and inequalities in 2D space (collision detection), the idea of a colour being made up of R, G and B, and the idea of a video being a set of images presented in quick succession.

They’re also all able to tell the difference between different quote marks, and can find the colon symbol on a keyboard, although I die a little every time I hear myself refer to “#” as “hashtag”.

Materials

If anyone wants to try this at home or indeed anywhere else… we ran the workshop on Linux, in a lab with OpenCV and Python installed. You can find the instructions for the kids here: worksheets for kids on Google Docs, and the code samples broken down by week here: https://github.com/handee/ARCfacegame. Let me know if you use them.