I was lucky to get a ticket to hear Andrew Blake’s Lovelace lecture, on the subject of “Machines that (learn to) See”.
Machine vision works nowadays. Machines can: navigate using vision; separate object from background; recognise a wide variety of objects, and track their motion. These abilities are great spin-offs in their own right, but are also part of an extended adventure in understanding the nature of intelligence through visual perception.
The speaker was Laboratory Director at Microsoft Research, Cambridge and his team was behind the Kinect technology. He is now Research Director at the Turing Institute.
The lecture covered the history of machine vision over the last 50 years, the rise and fall of different approaches to AI over the decades, and finally the recent successes of analysis-by-synthesis and empirical recognisers.
Phil Nash organised another C++ London meet-up at SkillsMatter last week. The first talk was by Pete Goldsborough, who gave a rapid overview of the Clang tooling libraries. The second talk was by Kris Jusiak, who talked through the motivation and usage of his Boost.DI dependency injection library. This was more relevant to my work because Kris's example showed how Boost.DI aims to reduce the overhead of setting up test scenarios for GTest/GMock. I've been pretty happy with the way my unit tests look so far, but next time I'll definitely look at whether his injector object could simplify my code.
Maksim gave a very interesting presentation on Machine Learning, from his perspective as a physicist.
Machine Learning, AI and NLP are some of the most exciting emerging technologies. They are becoming ubiquitous and will profoundly change the way our society functions. In this talk I hope I can provide a unique perspective, as someone who has entered the field coming from a more traditional Physics background.
Physics and Machine Learning have much in common. I will explain how the two fields relate and how a physical point of view can help elucidate many ML concepts. I will show how we can use Python code to generate illustrative visualizations of Machine Learning algorithms. Using these visual tools I will demonstrate SVMs, overfitting, clustering and dimensionality reduction. I will explain how intuition, common sense and careful statistics matter a great deal when doing Machine Learning, and I'll describe some tools used in production.
Maksim used Jupyter Notebooks for the demonstration parts of his talk. It's a great way to show snippets of code as well as to plot charts – I've also been using it for a Python library that I'm working on.
The big take-away was that the audience should think of machine learning as very accessible – although there are hard problems left to research, there are a lot of materials available on the internet and much can be understood readily, especially from a visual perspective.
This evening’s presentation at the Institute of Engineering and Technology was sponsored by Hitachi on the subject of The Cloud.
As the Public Cloud is seeing explosive growth for modern internet-based businesses and their web-native applications, how can organisations with a more traditional IT landscape benefit from some of these trends whilst maintaining their legacy systems?
Neil Lewis explained that, despite years at the forefront of Data Services, Hitachi Data Systems is now re-positioning itself as a Cloud Solutions provider, rather than solely provisioning private infrastructure and software support to enterprises. Time will tell whether they can compete with Amazon Web Services or Microsoft Azure – but Hitachi have decided to adapt rather than see their business model become irrelevant.
Phil Nash presented his ideas on functional C++ to a packed ACCU meeting a couple of weeks ago. He kindly provided the slides on his website.
For the uninitiated, the functional style is often quite a shock, but having written F# for some time, I’m in favour of “modelling computations as evaluations of expressions” as Phil presented it, or the declarative style as it’s often described. I wrote about Higher-Order Functions in C++ recently and Phil touched on that as well.
One of the highlights of the talk was the section on persistent data structures, which share as much of the previous state as possible whenever additional elements are added. For example, adding a new element to an associative binary tree can retain links to the bulk of the original tree. There are challenges in keeping such trees balanced, but the benefits are often worth it (e.g. a persistent red-black tree that's thread-safe because all the data is immutable). Phil also presented a trie/hashing hybrid – a persistent tree structure with performance similar to unordered_map, where the hashing ensures no re-balancing is required.
The finale was a demonstration of pipelining for C++, based on std::optional (available from C++17). The recommendation was to watch Eric Niebler’s Ranges talk from CppCon 2015 for more details.
This evening’s lecture at the IET was given by Chris Aylett of the Motorsport Industry Association. Chris gave a fast-paced overview of the work of motorsport engineers within their own industry and the increasing crossover into other sectors. He is a fan of horizontal innovation, the application of under-used skills and capacity within a firm to satisfy demand from clients in other industries.
This is particularly appropriate for the world-class unique capabilities of R&D-based motorsport suppliers in the UK who are able to resolve disparate engineering problems, and do so very quickly.
Particular examples were given by speakers from Wirth Research, Prodrive and Lentus Composites. The latter were responsible for the design of the Team GB track bikes which did rather well at the Rio Olympics – having been developed in just 13 months.
There was also plenty to reference from the inspirational life story of Sir Henry Royce. Despite having only one year of formal schooling, he became an apprentice engineer and ultimately started his own business making cranes. Not only did he expand into making motor cars and design the aero-engine that powered the first aircraft to fly at over 400mph (an engine later developed into the famous Rolls-Royce Merlin of the WWII Spitfires) – he also designed the bayonet lightbulb.
I was thrilled when the IET announced that they were organising a seminar on the European Space Agency's Rosetta mission. I'd followed the progress of the mission and the audacious landing of the Philae probe on the comet – it was fascinating to meet Paolo Ferri, the Operations Manager for the mission. Also speaking was Mark Bentley, Principal Investigator of the MIDAS instrument on board Rosetta.
I asked Paolo how the agency chose comet 67P as the target for the mission. Apparently, it wasn't the original choice, but a failed Ariane rocket caused the mission to pause and the launch window for the original comet was missed. They re-examined the list of candidates and, given that Rosetta and Philae had already been built, the only other suitable comet for a mission of that type was 67P!