Researchers at Oxford University are attempting to recreate human thinking patterns in machines using a language guided imagination (LGI) network. Feng Qi and Wenchuan Wu have used a model of the prefrontal cortex to create an artificial neural network, in an attempt to reproduce human-like thinking patterns in machines. Their work could inform the development of artificial intelligence (AI) that is capable of human-like thinking.

Human thinking requires the brain to understand a particular language expression and use it to organise ideas in the mind. The human brain is able to generate mental images guided by language. For example, if a person notices it is raining, they might internally say, "I need an umbrella" before deciding to get one. As the thought travels through the brain, they automatically understand what the visual input means, and how an umbrella will prevent them from getting wet.

AI machines can now recognise images and process language, but this "continual" or imaginative thinking ability is, at the moment, restricted to humans. While an AI machine would be able to recognise the raindrops, there would be no similar thought process linking the rain to the need for an umbrella. These machines are unable to understand and interpret language with the same depth as humans. Human thinking systems, by contrast, have a cumulative learning capacity that grows as the brain develops. This capacity is associated with the prefrontal cortex, the part of the brain responsible for the memory processes that take place while people perform a task.

The LGI network developed by Qi and Wu has three key subsystems: a vision system, a language system, and a synthetic prefrontal cortex. The vision system contains an encoder that converts real or imagined input scenarios into abstract population representations, as well as an imagination decoder that reconstructs imagined scenarios from these higher-level representations. The language system imitates the part of the brain which extracts quantity information and converts binary vectors into text symbols. The final component, which also imitates a part of the brain, is the prefrontal cortex (PFC), which combines the language and vision representations and predicts text symbols and manipulated images.

In their paper, 'Human-like machine thinking: Language guided imagination', they wrote: "We proposed a Language guided imagination (LGI) network to incrementally learn the meaning and usage of numerous words and syntaxes, aiming to form a human-like machine thinking process."

"LGI has incrementally learned eight different syntaxes (or tasks), with which a machine thinking loop has been formed and validated by the proper interaction between language and vision system."

"The paper provides a new architecture to let the machine learn, understand and use language in a human-like way that could ultimately enable a machine to construct fictitious 'mental' scenario and possess intelligence."

Qi told Cherwell: "I think this work may open a new page of AI."

Further research into the LGI network could lead to the development of more advanced AI, capable of more complex, human-like thinking strategies.
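The three-subsystem loop described above can be sketched in code. This is a minimal illustrative sketch, not the authors' actual network: the dimensions, weight shapes, and the rain-and-umbrella vectors are all assumptions chosen for clarity, and the real LGI network is a trained neural network rather than random linear maps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- the paper's actual sizes are not given here.
IMG_DIM, LANG_DIM, HID_DIM = 16, 8, 12

# Vision encoder: converts an image (real or imagined) into an
# abstract population representation.
W_enc = rng.normal(size=(HID_DIM, IMG_DIM))
def vision_encode(image):
    return np.tanh(W_enc @ image)

# Imagination decoder: reconstructs an imagined scenario from the
# higher-level representation.
W_dec = rng.normal(size=(IMG_DIM, HID_DIM))
def imagine_decode(rep):
    return W_dec @ rep

# Language system: maps binary text vectors into the same hidden space.
W_lang = rng.normal(size=(HID_DIM, LANG_DIM))
def language_encode(text_vec):
    return np.tanh(W_lang @ text_vec)

# Synthetic PFC: fuses the vision and language representations and
# predicts the next representation in the thinking loop.
W_pfc = rng.normal(size=(HID_DIM, 2 * HID_DIM))
def pfc_step(vis_rep, lang_rep):
    return np.tanh(W_pfc @ np.concatenate([vis_rep, lang_rep]))

# One turn of the machine-thinking loop: see rain, internally say
# "I need an umbrella", imagine the resulting scene.
image = rng.normal(size=IMG_DIM)                    # e.g. raindrops
text = rng.integers(0, 2, size=LANG_DIM).astype(float)  # e.g. the sentence
fused = pfc_step(vision_encode(image), language_encode(text))
imagined = imagine_decode(fused)  # fed back to the encoder as the next input
```

Feeding `imagined` back through `vision_encode` is what closes the loop: the machine's own imagined scene becomes the next visual input, which is the "continual thinking" behaviour the article describes.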