Course Description
In this course, participants will learn the fundamental concepts of generative AI, including an overview of existing large language models. They will gain hands-on experience using prompt engineering to tailor the responses of generative AI tools, such as OpenAI’s ChatGPT and Google’s Gemini, to produce text, code, images, and other creative content. Participants will also be introduced to more advanced tailoring techniques, such as retrieval-augmented generation and model fine-tuning, which allow for even greater control and customization of the AI's output.
Learning Objectives
Understand the basic concepts of generative AI and how it works
Learn how to use prompt engineering to tailor the responses of generative AI tools
Gain experience using generative AI to produce text, code, images, and other creative content
Explore more advanced tailoring techniques, such as retrieval-augmented generation and model fine-tuning
Apply generative AI to real-world tasks and projects
** This course description was created by Google’s Gemini generative model.
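Prompt engineering, the core hands-on technique in this course, amounts to shaping the instructions and examples sent to a model so its output matches your needs. A minimal sketch of the idea is below; the helper function, persona strings, and example texts are illustrative, not part of any specific tool's API, and no model is actually called:

```python
# A minimal sketch of prompt engineering: the same user question is tailored
# by wrapping it in different system instructions and few-shot examples before
# it would be sent to a chat model. All names and texts here are illustrative.

def build_messages(question, persona, examples=()):
    """Assemble a chat-style message list that steers a model's tone and format."""
    messages = [{"role": "system", "content": persona}]
    for user_text, assistant_text in examples:
        # Few-shot examples show the model the desired style of answer.
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    messages.append({"role": "user", "content": question})
    return messages

# The same question, tailored two different ways:
concise = build_messages(
    "What is retrieval-augmented generation?",
    persona="You are a terse assistant. Answer in one sentence.",
)
tutor = build_messages(
    "What is retrieval-augmented generation?",
    persona="You are a patient tutor. Explain with an analogy and an example.",
    examples=[
        ("What is fine-tuning?",
         "Think of it like coaching: you adjust an already-trained model "
         "with new examples so it performs better on your specific task."),
    ],
)
```

Sending either message list to a chat model would yield markedly different answers to the identical question, which is the essence of tailoring responses through prompting.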
Instructors
Jay Boisseau is an experienced, recognized leader and strategist in advanced computing technologies, with over 25 years in the field. Jay is the executive director and founder of The Austin Forum on Technology & Society, which he created in 2006 and which has become the leading monthly technology outreach and engagement event in Austin, now attracting national and international attendees online. The Austin Forum is one of the pillars of the Austin tech scene, providing connections to information, ideas, collaborations, and community.

In addition, Jay is CEO and co-founder (June 2014) of Vizias, a small team of passionate professionals with expertise in high-performance computing (HPC), artificial intelligence (AI), technology community building, and technology outreach and event planning. Vizias staff lead, execute, and support the Austin Forum through Vizias Research, Education, and Outreach, a non-profit dedicated to using technology for positive social impact. Jay has held previous leadership positions at Dell Technologies, the Texas Advanced Computing Center (TACC) at The University of Texas at Austin, the San Diego Supercomputer Center, and the Arctic Region Supercomputing Center. He received his doctorate in astronomy from UT Austin and his undergraduate degree in astronomy and physics from the University of Virginia.
Luke has spent the last 20 years advancing the state of the art in high-performance computing and artificial intelligence through roles in academia, finance, and technology.
In 2005 Luke joined the Texas Advanced Computing Center (TACC) at The University of Texas at Austin as a member of the HPC research staff and a lecturer in the Department of Statistics and Scientific Computation. While at TACC, Luke helped design, deploy, operate, and program more than a dozen Top500 systems from vendors such as IBM, Sun Microsystems, Dell, and Cray. In 2016 Luke became Director of Training and Professional Development at TACC and developed the successful and popular TACC Institute Series of week-long training courses in HPC, Data Analytics, Cloud Computing, HPC Administration, and HPC Leadership. In 2017 Luke moved to Dell Technologies, where he served as Chief Data Scientist and Distinguished Engineer for HPC/AI in the Infrastructure Solutions Group. While at Dell, Luke led the development of dozens of patents in areas such as infrastructure configuration, cloud computing, and containerization.
In 2022 Luke joined market maker and high-frequency trading firm Optiver as Head of Global Research Infrastructure, where he led a global team advancing the firm's computing, storage, networking, and software strategies and deployments. Luke holds a PhD in Computer Science from the University of Texas at San Antonio and has worked on many high-profile projects, including providing data-processing support for the Nobel Prize-winning LIGO project and introducing performance and parallel-scaling optimizations for early transformer neural networks, paving the way for technologies like GPT-3/ChatGPT. He is the author of more than two dozen peer-reviewed research papers.
Outside of work, Luke enjoys science fiction and superhero movies, classical history and Egyptology, golf, and spending time with his wife and two children.