Two exhibitions at the Science Museum in London offer retrospectives and overviews of current research:
- ‘Robots’ (8 February-3 September 2017) https://beta.sciencemuseum.org.uk/robots
(the following text is adapted from extracts of the panels explaining the showcases)
History: from clockwork to electric batteries and Morse code instructions
The exhibition first traces the history of humanoid robots, showing the key changes in how they were powered and controlled, from the clockwork-driven automata of the Renaissance to early 20th-century machines using electric batteries: ‘Eric’ (1928) had electric motors and a control system, while ‘George’, built by Tony Sale in 1949, used a remote-control system relying on Morse code instructions sent from a hand-held transmitter. The robots of those early days are on view.
Cybernetics and Artificial Intelligence: robots acting in unpredictable situations
These robots, however, needed a hidden human operating them with a remote controller or microphone. Researchers therefore attempted to devise new types of robots that could act on their own; they studied the workings of the human mind in order to create machines that reacted to their surroundings as humans do. This was the age of Cybernetics, in the 1950s. Tortoise-shaped robots built by Grey Walter in 1951 were fitted with touch and light sensors; here appears the notion of ‘sensor’, which we nowadays use both for real robots and for virtual reality models. They reacted to their environment by making decisions, for instance avoiding obstacles, and could act in unpredictable surroundings. The purpose of such experiments was twofold: to apply knowledge of decision-making processes to the design of robots, and conversely to use the experiments to explore the human mind by modelling it, studying how a combination of brain cells could lead to complex, lifelike behaviour and to interaction with new surroundings. Cybernetics brought together researchers from numerous disciplines to recreate the workings of the mind through machinery: this became the study of ‘Artificial Intelligence’.
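For readers curious about the principle at work, the kind of reactive control the tortoises embodied can be sketched in a few lines of Python. Everything in the sketch (the sensor readings, thresholds and actions) is an illustrative assumption, not a reconstruction of Grey Walter's analogue circuitry.

```python
# A minimal, hypothetical sketch of a reactive sensor-to-action loop in the
# spirit of the tortoise robots: current sensor readings are mapped directly
# to decisions, with no stored plan of the environment.

import random

def read_sensors():
    """Stand-in for real hardware: returns (light_level, touching_obstacle)."""
    return random.random(), random.random() < 0.1

def decide(light_level, touching_obstacle):
    """Choose an action from the current sensor readings alone."""
    if touching_obstacle:
        return "back up and turn"         # avoid the obstacle just encountered
    if light_level > 0.7:
        return "steer towards the light"  # phototaxis, as in the tortoises
    return "wander"                       # explore unpredictable surroundings

if __name__ == "__main__":
    for _ in range(5):
        light, touch = read_sensors()
        print(decide(light, touch))
```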
In the 1960s, robots were introduced into industry, but they only performed the repetitive tasks they were programmed to do step by step; they could not adapt as the cybernetic robots could.
In the showcases of this section of the exhibition, we see the machines in their surroundings, with photos showing them in action.
Robots and computer vision
In the early 21st century, robots came to incorporate research on the processing of visual information: how humans, with their 100 billion neurons, tell objects apart and identify them from different angles. ‘Lucy’ (2001-06), made by Steve Grand, has an electronic brain of 50,000 artificial neurons and can tell one fruit from another. Vision tracking is one component of such research: robotic eyes are modelled and wired so as to replicate the constant repositioning of human eyes as they explore the world and focus on new items, as in ‘Mac-eye’ (Genoa, 2011). This research builds on the ‘eye-tracking’ studies begun in the 1960s by Alfred Yarbus in Russia and continued in various laboratories today, while the robot ‘Cog’ explores another aspect of computer vision: identifying the direction of someone’s gaze and turning in the same direction. Videos accompanying the showcases follow these processes.
In the successive sections of the exhibition, the exhibits and the supporting material evolve with the increasing complexity of the tasks given to the robots.
- ‘Our Lives in Data’ (15 July 2016-1 September 2017)
http://www.sciencemuseum.org.uk/visitmuseum/plan_your_visit/exhibitions/our-lives-in-data
The exhibition presents interactive applications demonstrating the uses of large datasets in several fields and the interactions between them.
Transport
‘Your journeys are data and they are shaping London’: using data from transport cards, TfL (Transport for London) traces journeys in order to adapt its services. On a map of the underground network, visitors can see at what moment of the day more people exit a station than enter it, and vice versa: the interactive maps show that at midday more people board the underground in the suburbs and arrive at central London stations, whereas in the evening more people enter stations in central London and exit at stations in the suburbs.
The graphs also contrast the travel patterns of regular commuters, for whom the colours marking journey times fall in similar places on every weekday, with those of people who have no regular commute, for whom the arrangement of the colours differs from day to day.
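As an illustration of the kind of aggregation behind these interactive maps, the following minimal Python sketch counts entries and exits per station and hour from made-up tap records; the stations, times and data model are assumptions for illustration and do not reflect TfL's actual systems.

```python
# A minimal sketch: for each station and hour, compare how many people enter
# and how many exit, using invented tap records.

from collections import defaultdict

# Hypothetical records: (station, hour_of_day, direction), where direction is
# "in" for a tap at the gates on entry and "out" for a tap on exit.
taps = [
    ("Brixton", 8, "in"), ("Oxford Circus", 8, "out"),
    ("Oxford Circus", 18, "in"), ("Oxford Circus", 18, "in"),
    ("Brixton", 18, "out"), ("Brixton", 19, "out"),
]

# Net flow per station and hour: entries count +1, exits count -1.
net_flow = defaultdict(int)
for station, hour, direction in taps:
    net_flow[(station, hour)] += 1 if direction == "in" else -1

for (station, hour), flow in sorted(net_flow.items()):
    if flow > 0:
        trend = "more entries than exits"
    elif flow < 0:
        trend = "more exits than entries"
    else:
        trend = "balanced"
    print(f"{station} at {hour}:00 -> {trend} ({flow:+d})")
```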
There are also maps of individual stations, showing the interchanges and the number of people changing from one line to another, thus offering suggestions for the design of passenger circulation routes.
A historical retrospective shows that in 1939, to study patterns of public transport use, a survey was carried out by collecting 4 million tickets over 3 days; sorting them by hand then took 6 months. The same processing now takes 0.4 seconds.
Tastes and personality traits
The exhibition includes a game for visitors based on the idea that liking certain brands is linked to personality traits. From a list of brands, visitors are asked to click on those they like, and their answers place them on scales from Cautious to Curious, Impulsive to Organised, and other dimensions, which appear on a screen. An advertisement is then rearranged for each visitor to match the personality thus revealed: for those with a competitive inclination, for instance, the advertisement shows them attempting to catch as many of the goods as possible; other ways of redesigning the same advert for impulsive or anxious customers are also shown.
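The scoring behind such a game might resemble the minimal Python sketch below, in which each liked brand nudges the visitor along a pair of trait scales. The brands, weights and scales are invented for illustration and do not reflect the exhibition's actual model, which is not disclosed.

```python
# A purely hypothetical sketch of brand-to-trait scoring. Each liked brand
# shifts the visitor along two example scales:
# Cautious (-) <-> Curious (+) and Impulsive (-) <-> Organised (+).
BRAND_WEIGHTS = {
    "AdventureTravelCo": {"curious": +0.8, "organised": -0.2},
    "BudgetSupermarket": {"curious": -0.4, "organised": +0.6},
    "FlashFashion":      {"curious": +0.3, "organised": -0.7},
}

def score_visitor(liked_brands):
    """Sum the weights of the liked brands into a simple trait profile."""
    profile = {"curious": 0.0, "organised": 0.0}
    for brand in liked_brands:
        for trait, weight in BRAND_WEIGHTS.get(brand, {}).items():
            profile[trait] += weight
    return profile

if __name__ == "__main__":
    profile = score_visitor(["AdventureTravelCo", "FlashFashion"])
    print("Cautious <-> Curious:", profile["curious"])
    print("Impulsive <-> Organised:", profile["organised"])
```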
Creative game design for the scientific study of data
Skills such as creative design can be harnessed for the study of data, especially for finding patterns in it. The exhibition shows a virtual reality headset used to see 3D patterns in large datasets. The 3D graphics designer Matt Ratcliffe demonstrates how his skills can be applied to scientific data, confirming his ‘belief that art and science can solve real-world problems together’. Dr Becca Wilson, a genomics researcher, also stresses the use of the virtual reality display to explore large datasets. Game designers have thus created virtual reality tools so that scientists can trace patterns in large datasets, for instance in medicine.
Large datasets are thus used to adapt services to individuals, and tools developed in one area spark discoveries in broader fields.