Great excitement in 47b this evening as a new member of the family arrives, in the shape of an Anycubic Photon 3D Printer. This is a resin printer, which should be considerably smoother on a print than the PLA machine that we have in work (we make software – totally unrelated).
First print has kicked off, which hopefully will result in a translucent green Totoro in about 6 hours’ time. Or, indeed, some intolerable resiny mess all over the place and disappointment for all. With the PLA printer, I think it took 4 prints to get all the levelling, pre-heating and such jiggerypokery practiced enough to ensure regular successful prints.
Next up – go to bed and set the alarm for 4am to check this fella, or stay here and be overcome by resin fumes before midnight.
Several years ago I catalogued all my paper books using the barcode scanning capabilities of the Goodreads app and it was a task. I wanted a handheld zapparoony thing so I could just bip my way through the piles and then upload, recognize and store titles, ISBNs, etc. No reviews or sharing, just fast maintenance of my own library of 500 or so books.
“To the internet!”, I cried, only to have my hopes dashed by the pricetags on the zapparoony things that were nudging up over the $200 mark. Far too much, so the project was abandoned.
Thanks to the fabulous trove that is Ali Express, I picked this little yella fella up for the princely sum of $24. It’s 433MHz wireless USB, is rechargeable, and has an inventory mode wherein you can bip all day and then upload all the bippage at your leisure. And the thing identifies as a keyboard! It basically writes to STDIN. In the photo there you will see an emacs window on the right with just numbers in it – these are the data from the scanner.
Next steps – automation. After an interlude of heavy bipping, I’ll need to upload all the numbers to somewhere, and have an automated process do the lookups to flesh out the details of the books. On the local machine I’ll write a wee app to do the grabbing of the STDIN and the upload to wherever the storage will occur.
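Since the scanner types like a keyboard, the grabbing app really only has to read lines and weed out mis-bips. A minimal sketch of the idea – the check-digit maths is standard ISBN-13, but everything else here is my assumption about how I’ll wire it up:

```python
def isbn13_valid(code):
    """ISBN-13 check: digits get alternating weights 1 and 3, and the
    weighted sum must be divisible by 10."""
    if len(code) != 13 or not code.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(code))
    return total % 10 == 0

def collect(lines):
    """Split a stream of scanned lines into valid ISBNs and rejects
    that need re-bipping."""
    good, bad = [], []
    for line in lines:
        code = line.strip()
        if not code:
            continue
        (good if isbn13_valid(code) else bad).append(code)
    return good, bad
```

Feed it `sys.stdin` (or a dumped inventory file) and the rejects list tells you which books to go back and bip again.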
All of this will happen in my copious spare time.
After all the guff in the online “press” regarding there being some kind of a state-actor threat through infiltration of Supermicro and corruption of the server board supply chain, I get this on LinkedIn.
If you haven’t been following the aforementioned kerfuffle – see
Someone is going to get a book out of this one.
This is a tiny chapel sitting on a promontory reaching out into a lake near Gougane Barra in West Cork. It was an iPhone 6 shot – it’s been cleaned up and messed with a bit, with Affinity Photo, an app I have no idea how to use properly.
It’s because all of the Mojave betas broke my Lightroom.
You only need to do three things:
Apparently that’s it.
Now, go and practice for 10 years.
Oh yes, that Otia article is indeed about me.
This little fella is the BME680 breakout board from Pimoroni. The BME680 itself is the little silver box with a hole in it. It’s an environmental sensor made by Bosch that seems to be able to do pretty much everything – it detects gas (oo-er), pressure, humidity and temperature. The gas part there is about air quality, which is why I got it. You see, past Oisin wrote down in his little project book that future Oisin is to make an air quality monitor, and a humidity monitor, and while current Oisin isn’t quite sure why past Oisin wanted these things, it seemed like a good idea to do both at once.
It’s an I2C device, and my plan is to hook it up to a Pi Zero W, set up as headless and configured to do I2C warbling. The nice chaps at Pimoroni have provided a Python module for it and of course a tutorial which seems to cover everything.
Aside from anything else, the fact that a Pi Zero W is actually a computer has me in a constant state of amazement. But look, I am ssh’ing into the damn thing on my network!
Setting up the Pi Zero is a breeze – you ssh into it, change the password and hostname, apt-get to upgrade packages. Python 2.7 is on the box and that’s what we use to put things together.
Bit of soldering later (no headers on these boards by default) and this is the sensor + pooter package:
Extra packages you need to apt-get install are python-pip, python-smbus and i2c-tools – the first to enable further Python package installation, the other two for the libraries that speak the I2C protocol to the BME680. Follow the instructions in the tutorial to get the bme680 package and the examples, and run the read-all.py script. You might expect it to work out of the box, and it does!
That’s output from the “burn-in” – temperature, pressure, humidity and the raw resistance reading from the gas sensor. About 20 minutes later the gas sensor value stabilized at around 171kΩ. Just to give it a test, I put a permanent marker of the stinky variety beside the sensor and the resistance reading plummeted to 7.6kΩ. So something is happening at least when in the presence of VOCs.
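The Pimoroni examples turn those raw readings into a rough air-quality percentage by blending humidity against a 40% “ideal” and gas resistance against the burned-in baseline. A hedged sketch of that blend – the 171kΩ baseline is from my burn-in above (yours will differ), and the 25/75 weighting follows their example code as far as I can tell:

```python
GAS_BASELINE = 171000.0   # ohms, clean-air value from the burn-in (assumption)
HUM_BASELINE = 40.0       # % relative humidity, the "ideal" indoor value
HUM_WEIGHT = 0.25         # humidity contributes 25% of the score, gas the rest

def air_quality_score(gas_ohms, humidity):
    """Blend gas resistance and humidity into a 0-100 score. Lower
    resistance (more VOCs) and humidity far from 40% both drag it down."""
    # Humidity part: full marks at the baseline, falling off linearly.
    hum_offset = humidity - HUM_BASELINE
    if hum_offset > 0:
        hum_score = (100 - HUM_BASELINE - hum_offset) / (100 - HUM_BASELINE)
    else:
        hum_score = (HUM_BASELINE + hum_offset) / HUM_BASELINE
    hum_score = max(0.0, hum_score) * HUM_WEIGHT * 100

    # Gas part: full marks at (or above) the clean-air baseline.
    gas_score = min(gas_ohms / GAS_BASELINE, 1.0) * (1 - HUM_WEIGHT) * 100
    return hum_score + gas_score
```

On the stinky-marker numbers above, the score drops from 100 to somewhere in the high 20s, which feels about right.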
Next step: put this timeseries somewhere so that I can make GRAFFSS. Also I really need to brush up on Python programming.
A few shots of some of the garden birds that have been hanging around this weekend www.oisin.photo/Birds/
Developers everywhere are having various grades of freakout about the EU’s General Data Protection Regulation - unofficial link to HTML version.
The core of this freakout is ambiguity, and it’s a key insight into how developer brains work, especially when the job to be done is something that seems extraneous, dull, or difficult.
If anything, software development is the action of embodying a swirl of concepts and rules that support the creation of a targeted set of interactions with people or things. The process is about entirely eliminating ambiguity from the field.
With this GDPR business, we are seeing the legal profession slop over into development, and legal wrangles are all about ambiguity, and interpretation. This is why no piece of legislation in its own right is enough. It serves as a line where we start, but only after that line has been laid down can we practically begin to understand exactly how it affects the real world of people in all that world’s weirdness, complexity and indeed unexpected evolution and change. There’s no system tests here, folks.
To attempt to understand the whole situation better, I enrolled in an Advanced Diploma in Data Protection at King’s Inns, Ireland’s oldest law school. This is a course that looked at Data Protection in general, with a solid focus on GDPR, but taken very much from a legal perspective, although muggles like myself were also invited to join in.
Well, it was fun times, and the whole thing is a little complex. GDPR helps to harmonize across the 28 (for now) countries in the EEA – everyone had different rules and it was super onerous to have to know everything. GDPR is a Regulation, which means it goes as written into the legislation of each member state – no diffs permitted. Governments are subject to it: this means that it is illegal, for example, for the DoJ to share conviction data with the DMV for the purposes of the DMV checking up on uninsured cars, because, maybe, some dude reckons that all people with a conviction are bad peoples and ergo won’t insure their cars.
That’s a big picture – the entire reason for the presence of the regulation is to help level the playing field for individual humans who are going up against corporations and states.
To come back to the development aspect. The ambiguities persist. This piece of writing was triggered by @DazeEnd and @joec discussing whether it was ok to just mark a database record with deleted=yes, since it’s ok to delete data on a harddrive, but the underlying harddrive still has the data on it, marked as released=yes (see note below). That is a valid question, from the point of view of a developer. My assertion is that from a practical perspective, this is the kind of niggle that we must dismiss – and the kind of niggle that occurs when there’s a job to be done that no-one really wants to do. The result of niggling at this level is eventual upgrade to the atomic version – “we are now not available in Europe” – because it looks so DIFFICULT because there are all these EDGE CASES.
You could think, as a developer, that the lawyers worry about this kind of fine-grained issue. They don’t. This is one of those situations where they say, well, here’s the risk, you have to make a decision, document it, and be ready to back that up in front of a judge should the soup hit the fan.
In this particular case it’s straightforward enough. Are you in control of the presence of data in your database? Yes. It’s up to you to delete it when requested. Are you in control of the data on your harddrive? Yes. It’s up to you to delete it when requested. Are you in control of the operating system implementation or database implementation of deletion? No. Could you get the data back if you wanted to? Yes – but that’s not part of your usual run of business, so why would you explicitly do that? What if some bad dude steals your harddrive and then rummages through it? Ok we are getting a little far-fetched here for most businesses that are not keeping special category data, but if this does happen, then you have failed in your security controls.
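To make the niggle concrete, here are the two deletion strategies in question, sketched with sqlite (the table and names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, deleted INTEGER DEFAULT 0)"
)
conn.execute(
    "INSERT INTO users (id, email) VALUES (1, 'a@example.com'), (2, 'b@example.com')"
)

# Soft delete: the row, and the personal data in it, are still right there
# in the layer you control.
conn.execute("UPDATE users SET deleted = 1 WHERE id = 1")
still_there = conn.execute("SELECT email FROM users WHERE id = 1").fetchone()

# Actual delete: the data leaves the layer you control. What the database
# engine or OS does with freed pages underneath is not your layer -- which
# is the point of the argument above.
conn.execute("DELETE FROM users WHERE id = 2")
gone = conn.execute("SELECT email FROM users WHERE id = 2").fetchone()
```

You control the first query; you don’t control the page allocator. Delete accordingly.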
I guess my overall point here is that GDPR Compliance is a continuum, not a tickbox. You want to be doing the best you can with it and document why you can go so far and not further. The companies that will be getting the big legislative fines are the guys that are willy-nilly exporting special category data out of the EEA en masse without the knowledge of the people associated with that data. The rest of us just need to muddle along as best we can.
Note – if you are using modern TRIM-enabled SSDs, the blocks behind your deleted files are typically scrubbed by the drive soon after deletion.
The AWS Summit London deployed the bejacketed nerd squad and fixed their registration system after 35 minutes of downtime and now I am going to drink some coffee and eat these twisty boys.
I just pledged to support a super worthy thing on Kickstarter – Within: A magazine about leadership for women in design/tech.
Important milestone this morning: managed to set up an AWS API Gateway endpoint with a Lambda authorizer and a Lambda implementation without resorting to a tantrum. I think I know how it all works now.
Famous last words.
Update: I thought I knew how it all worked until I decided to automate the whole thing with Terraform. I know nothing.
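For the record, the authorizer turned out to be the simplest bit once I saw it: a function that takes the incoming token and hands back an IAM policy for API Gateway to enforce. A minimal Python sketch – the token check here is a placeholder assumption; a real one verifies a JWT or looks up a key:

```python
def handler(event, context):
    """Lambda TOKEN authorizer: API Gateway passes the caller's token and
    the ARN of the method being invoked; we return an allow/deny policy."""
    token = event.get("authorizationToken", "")
    effect = "Allow" if token == "let-me-in" else "Deny"  # placeholder check
    return {
        "principalId": "user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```

The tantrum-inducing part isn’t this function – it’s wiring the authorizer, the endpoint and the permissions together, which is exactly the bit Terraform then made me relearn.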
It’s high time I got to know Swift (the programming language) better, so today I ran through one of the Wenderlich tutorials which has a bit of SpriteKit going on as well – those animations are the business!
I found this picture of Jim Dalrymple on stage in a swathe of old phone snaps. He was doing a talk at the Release Notes conference in Indianapolis back in 2015.
Printing myself a Dalek here.
I’ve also bought a signal generator kit with the most abstruse Chinglish build instructions, a job lot of surface mount ring modulators and some cheap lapel mics. I wonder if I can embed the mic in the Dalek so you can talk into it, and your voice and a sine wave signal from the generator will mix through the magic of the ring modulator to produce a Dalek voice.
The signal generator looks like this once it’s complete. It’s super tiny.
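Ring modulation, by the way, is nothing fancier than multiplying the two signals sample by sample – the hardware does it with a ring of diodes – and the classic Dalek effect is the voice times a sine of around 30Hz. A software sketch of what the little board will do (the sample rate and carrier frequency are my assumptions):

```python
import math

RATE = 8000        # samples per second (assumed)
CARRIER_HZ = 30.0  # roughly the classic Dalek ring-mod frequency

def ring_modulate(voice):
    """Multiply each voice sample by a sine carrier -- that is the whole
    ring modulator. Sum and difference frequencies fall out of the maths,
    which is what gives the metallic Dalek growl."""
    return [s * math.sin(2 * math.pi * CARRIER_HZ * i / RATE)
            for i, s in enumerate(voice)]
```

If the hardware version works half as well, EXTERMINATE.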
I’m relieved to see Flickr bought by SmugMug. After Verizon bought Yahoo in mid-2017, I reckoned that would be the end for the already fairly-stagnant Flickr service and set up a SmugMug account. Now I feel a little more relaxed about the future of my Flickr albums, and hopefully soon we will see import tools and a single subscription.
This is the view from the top of the Burj Khalifa in Dubai. It’s a long way down to that giant pond!
Organizing in Tech – one story about a company, ostensibly managed by gobshites, whose workers went down the road of tech unionisation. Super interesting topic for anyone who is working in tech and is being put on a crunch treadmill by an under-qualified management team. Also a reminder that managers should manage, and that leaders should lead.
This is an amazing effort – digitisation of the Bibliotheca Palatina by the University of Heidelberg
When content delivery and storage is as mutable and delicate as bits in a chip, how do we prevent historical black holes opening up in the future past? Future us will be compelled just to see the HEAD revision of history, not the changes as they happened.