A beginner's guide to some current AI buzzwords

By Bowen

Artificial Intelligence (AI) is not just a futuristic concept; it is already shaping the world around us. However, diving into the AI realm can feel like stepping into a world filled with jargon, buzzwords, and acronyms, which can be daunting to a beginner who just wants to find out what all the hype is about or understand some heated Twitter/X threads (and believe me, there are many). This guide will unravel the mysteries around a select few AI buzzwords. Please keep in mind that the following is not exhaustive: it is intended for a beginner audience and does not dive deep into technical background. Those who are interested in reading more can follow the included links.

Artificial Intelligence (AI) 

Let's start with the basics. AI refers to machines or computer systems that mimic intelligent human behavior. Essentially, if a computer can learn and make decisions without being explicitly programmed, that's AI at work. It's the brains behind smart assistants like Siri, Alexa, and ChatGPT.

Machine Learning (ML) 

Machine learning is a subset of AI that enables systems to learn from data and improve their performance over time. It is like teaching a computer to recognize patterns in data and make predictions without being explicitly programmed for each task.
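
For the curious, here is a minimal sketch of that idea in Python using the scikit-learn library (just one convenient option among many; the dataset and model choices are purely illustrative). The program is never given rules for telling flower species apart; it infers them from labeled examples and is then judged on flowers it has never seen.

```python
# A minimal sketch of "learning from data": fit a model on labeled examples,
# then let it predict labels for data it has never seen.
# Assumes scikit-learn is installed (pip install scikit-learn).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)          # flower measurements and species labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)  # no hand-written rules, just parameters to fit
model.fit(X_train, y_train)                # "learning" = adjusting parameters to match the data

print("accuracy on unseen flowers:", model.score(X_test, y_test))
```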

Deep Learning (DL) 

Now, let's go deeper. Deep learning is a specialized field of machine learning that involves neural networks inspired by the human brain's structure (well, kind of). These huge networks learn to perform tasks by analyzing vast amounts of data. Deep learning powers image and speech recognition on your phone and is increasingly being used in science, healthcare, remote sensing, and more. Nowadays, whenever AI is mentioned, it almost always involves methods from deep learning.
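
As a rough illustration, here is what a tiny neural network looks like in PyTorch (a popular deep learning library; the layer sizes below are arbitrary and the "image" is just random noise). Real deep learning models are the same idea scaled up to millions or billions of parameters and trained on real data.

```python
# A toy "deep" network: layers of simple units stacked on top of each other.
# Assumes PyTorch is installed (pip install torch); all sizes are arbitrary.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 128),  # e.g. a 28x28 image flattened into 784 numbers
    nn.ReLU(),            # a simple nonlinearity between the layers
    nn.Linear(128, 10),   # scores for 10 possible classes (say, digits 0-9)
)

fake_image = torch.randn(1, 784)  # random stand-in for a real image
scores = model(fake_image)
print(scores.shape)               # torch.Size([1, 10])
```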

GPT 

When ChatGPT was first released by OpenAI in 2022, the term GPT started trending outside of the AI realm. GPT, short for Generative Pre-trained Transformer, is a type of deep learning model that excels at understanding and generating human-like text. GPT models have shocked the internet with their ability to generate coherent and contextually relevant text, making them the powerhouse behind chatbots, content creation tools, and much more. Some important works in this area include GPT-1, GPT-2, and GPT-3.
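
If you want to poke at one of these models yourself, here is a minimal sketch that runs the openly released GPT-2 through the Hugging Face transformers library (just one convenient way to do it, not the only one); the output changes from run to run.

```python
# A minimal sketch of "generating human-like text": the model continues a
# prompt one token at a time. Uses the openly released GPT-2 model through
# the Hugging Face transformers library (pip install transformers).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Artificial intelligence is"
result = generator(prompt, max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])  # the prompt, continued by the model
```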

Transformer 

GPT, among other models, owes its power to the transformer architecture (the T in GPT). This innovative architecture allows a model to capture relationships between words in a sentence, understanding context and generating coherent responses. Beyond the GPT models, transformers are also used in many other important models, including BERT and T5 (you can probably guess what the T's stand for). Since the architecture is extremely general and powerful, it has been adapted to realms beyond language, including images, audio, video, and even genomic data. Transformers have revolutionized deep learning and have been the celebrities of AI ever since the architecture was first introduced in this research paper in 2017. See this for a high-level overview and this for an implementation walkthrough.
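
The key ingredient behind all of this is an operation called (self-)attention: every word gets a vector, and attention weights decide how much each word should "look at" every other word when building up its meaning. The sketch below strips this down to a few lines of NumPy with random vectors standing in for words; real transformers add learned projections, multiple attention "heads", and many stacked layers on top of this.

```python
# Scaled dot-product attention, the core operation inside a transformer,
# reduced to a toy example with made-up numbers.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # how relevant is each word to every other word?
    weights = softmax(scores, axis=-1)  # turn the scores into percentages per word
    return weights @ V                  # mix the word vectors using those percentages

words = np.random.randn(4, 8)  # 4 "words", each represented by an 8-dimensional vector
print(attention(words, words, words).shape)  # (4, 8): each word, now aware of its context
```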

Multimodal AI 

As AI models continue to scale, multimodal AI (AI that can process and/or generate data of multiple types, or modalities) has become increasingly popular. In a world where information comes from various sources, multimodal AI tries to make sense of it all. This approach involves processing and understanding data from multiple modalities, such as text, images, and audio, simultaneously. It enables models that can convert data from one modality to another, such as image captioning, image generation (the field behind the controversial area of AI art), or even music generation, as well as models that can understand and produce both images and text (such as the latest version of ChatGPT and DALL·E 3).
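
As one concrete, openly available example of a multimodal model, CLIP maps images and text into the same space, so it can score how well different captions describe a picture. The sketch below is only an illustration: it loads CLIP through the Hugging Face transformers library and uses a plain red square as a stand-in for a real photo.

```python
# A small multimodal example: score how well text captions match an image.
# Assumes the transformers and Pillow libraries are installed
# (pip install transformers pillow torch).
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.new("RGB", (224, 224), color="red")  # a plain red square
captions = ["a red square", "a photo of a dog", "sheet music"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)
for caption, p in zip(captions, probs[0].tolist()):
    print(f"{caption}: {p:.2f}")                   # "a red square" should score highest
```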

Bowen graduated from Harvard University, magna cum laude, with a joint degree in computer science and statistics. Currently, he works as a researcher in AI for healthcare. He will start a PhD at Stanford in 2024.
