Innovation, July/August 2023
HISTORY OF AI
ROOTS OF AI GO BACK TO THE 1950s
Work), Henrey says more direction is needed. “What would be helpful is to expand existing guidance and include AI in scope somehow because I think people are going to be using it more and more as time goes on. I think it would be useful to have an understanding of what some of the considerations are when they’re using it, to understand when it’s appropriate and when it’s not appropriate based on the intended use of the tool, the risk associated with the tool, and the knowledge of the underlying systems,” he says.

Henrey says the next generation of engineers is paying attention to trends in the field. Algorithms and types of machine learning are evolving fast. While generative AI has taken off with ChatGPT, it may not be dominant in the future. “There’s a lot of interest right now in big data, data processing, and machine learning tools for data analysis. Lots of students are looking for opportunities to expand their skill sets in those areas,” he says.

NOT ALWAYS OBVIOUS WHAT AI IS DOING

Simon Diemert, P.Eng., says it is sometimes hard to spot AI when it is operating in the background. The systems and software engineer at Critical Systems Labs in Vancouver has a degree in software engineering and a master’s in computer science from the University of Victoria. Diemert’s work involves the design and analysis of safety-critical, software-intensive systems, safety assessments, and validation. His work has covered the automotive, rail, medical, and marine sectors. He suggests a balanced approach when looking into AI.
The term “artificial intelligence” was officially coined in 1955 by John McCarthy, a US mathematician and computer scientist who was then a professor at Dartmouth College in New Hampshire. McCarthy proposed a two-month, 10-person study to take place in the summer of 1956, called the Dartmouth Summer Research Project on Artificial Intelligence. “The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it,” the proposal states. The event captured imaginations, and those who attended scattered and spread the idea of machine learning, hopeful they would build machines that could think and reason like us.

However, during the ’60s and ’70s, the field went through an “AI winter” as funding diminished and the realization sank in that building truly intelligent machines was far more complex than initially anticipated. AI experienced a resurgence in the ’80s and ’90s, with researchers exploring new approaches such as machine learning: rather than trying to explicitly program rules, they taught computers to learn from data. In the early 21st century, AI became increasingly advanced as computers became more powerful and the Internet was flooded with an ocean of data.
Illustration: AI creation
Then in 2012, a deep-learning algorithm called AlexNet won a noteworthy visual recognition competition, significantly outperforming the rest of the field and sparking a deep-learning revolution in AI. Neural networks, inspired by the structure of the human brain, have since shown unprecedented advancements. AI is now becoming increasingly woven into our everyday lives through virtual assistants like Siri and Alexa, algorithms that recommend products and entertainment tailored to our individual tastes, and vehicles that navigate the streets while avoiding pedestrians. It powers autonomous drones, diagnoses diseases, and even composes music.