Learning AI from Scratch – Category –
Chapter 7
[AI from Scratch] Episode 196: BERT and the Masked Language Model — Learning Mechanisms of BERT
Recap: Positional Encoding In the previous episode, we discussed Positional Encoding in the Transformer model. Positional Encoding is a technique that injects word-order information into the model, playing a crucial role in helping the Trans...
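To make the recapped idea concrete, here is a minimal NumPy sketch of the sinusoidal positional encoding from the original Transformer paper; the function name and shapes are illustrative, not code from the episode.

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding (assumes an even d_model):
    sin on even dimensions, cos on odd dimensions."""
    positions = np.arange(seq_len)[:, None]        # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]       # (1, d_model/2)
    angles = positions / np.power(10000, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even indices
    pe[:, 1::2] = np.cos(angles)                   # odd indices
    return pe

pe = positional_encoding(seq_len=50, d_model=128)
print(pe.shape)  # (50, 128); added to token embeddings before the first layer
```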
Chapter 7
[AI from Scratch] Episode 194: Multi-Head Attention Mechanism — The Core of the Transformer Model
Recap: The Internal Structure of GPT Models In the previous episode, we explored the internal structure of GPT models. GPT is based on the decoder part of the Transformer and uses techniques such as self-attention and masked self-attenti...
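As a quick illustration of the masked self-attention mentioned in the recap, here is a minimal single-head PyTorch sketch; the function name and tensor shapes are illustrative assumptions, not code from the series.

```python
import torch
import torch.nn.functional as F

def masked_self_attention(q, k, v):
    """Scaled dot-product attention with a causal mask,
    so position i can only attend to positions <= i."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5          # (seq, seq)
    seq_len = scores.size(-1)
    causal = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
    scores = scores.masked_fill(~causal, float("-inf"))    # hide the future
    return F.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(6, 16)     # 6 tokens, 16-dim head
print(masked_self_attention(q, k, v).shape)  # torch.Size([6, 16])
```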
Chapter 7
[AI from Scratch] Episode 193: The Internal Structure of GPT Models — A Detailed Look at the GPT Series
Recap: Details of Text Generation Models In the previous episode, we explored text generation models in depth. We learned how models like Sequence-to-Sequence, RNNs (Recurrent Neural Networks), and Transformers automatically generate nat...
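The models in that recap all generate text one token at a time; the sketch below shows the generic greedy decoding loop that this relies on. The toy model and names are placeholders, not any real library's API.

```python
import torch

def greedy_generate(model, input_ids, max_new_tokens=20, eos_id=None):
    """Generic greedy decoding: repeatedly feed the sequence back in
    and append the most likely next token."""
    for _ in range(max_new_tokens):
        logits = model(input_ids)            # (seq_len, vocab)
        next_id = logits[-1].argmax().item()
        input_ids = torch.cat([input_ids, torch.tensor([next_id])])
        if eos_id is not None and next_id == eos_id:
            break
    return input_ids

# Toy "model": a fixed random table standing in for a trained LM.
emb = torch.randn(100, 100)                   # 100-token vocab
toy_model = lambda ids: emb[ids]
print(greedy_generate(toy_model, torch.tensor([1, 2, 3]), max_new_tokens=5))
```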
Chapter 7
[AI from Scratch] Episode 192: Details of Text Generation Models — Text Generation Using Language Models
Recap: Evaluation Metrics for Image Generation In the previous episode, we explained evaluation metrics for image generation, specifically focusing on the FID score and IS (Inception Score). These metrics are crucial for assessing how cl...
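For reference, the FID in that recap compares the mean and covariance of real and generated feature distributions: FID = ||mu_r - mu_g||^2 + Tr(C_r + C_g - 2(C_r C_g)^(1/2)). A minimal NumPy/SciPy sketch follows; random vectors stand in for the Inception-v3 activations used in practice.

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(feats_real: np.ndarray, feats_fake: np.ndarray) -> float:
    """Frechet Inception Distance between two sets of feature vectors."""
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    c1 = np.cov(feats_real, rowvar=False)
    c2 = np.cov(feats_fake, rowvar=False)
    covmean = sqrtm(c1 @ c2)
    if np.iscomplexobj(covmean):   # numerical noise can add tiny imaginary parts
        covmean = covmean.real
    return float(np.sum((mu1 - mu2) ** 2) + np.trace(c1 + c2 - 2 * covmean))

print(fid(np.random.randn(256, 64), np.random.randn(256, 64)))
```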
Chapter 7
[AI from Scratch] Episode 191: Evaluation Metrics for Image Generation — FID Score and Other Methods
Recap: Pix2Pix In the previous episode, we covered Pix2Pix, a model for image-to-image translation. Pix2Pix can be applied to various transformation tasks, such as colorizing black-and-white images or generating realistic images from ske...
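Pix2Pix, recapped above, trains its generator with an adversarial term plus an L1 term against the paired target image. Here is a minimal PyTorch sketch of that combined loss; the function name and tensor shapes are illustrative, while the lam=100 weighting follows the original paper.

```python
import torch
import torch.nn.functional as F

def pix2pix_generator_loss(disc_out_fake, fake_img, target_img, lam=100.0):
    """Pix2Pix generator objective: fool the discriminator (adversarial term)
    while staying close to the paired ground truth (L1 term)."""
    adv = F.binary_cross_entropy_with_logits(
        disc_out_fake, torch.ones_like(disc_out_fake))
    l1 = F.l1_loss(fake_img, target_img)
    return adv + lam * l1

# Dummy tensors standing in for PatchGAN discriminator outputs and images.
loss = pix2pix_generator_loss(torch.randn(4, 1, 30, 30),
                              torch.rand(4, 3, 256, 256),
                              torch.rand(4, 3, 256, 256))
print(loss.item())
```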
Chapter 7
[AI from Scratch] Episode 190: Pix2Pix — A Model for Image-to-Image Translation
Recap: Conditional GAN (cGAN) In the previous episode, we discussed Conditional GAN (cGAN). cGANs condition data generation on additional inputs, enabling the creation of data with specific attributes. This capability is usef...
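The conditioning idea from the recap is commonly implemented by concatenating a label embedding with the noise vector; here is a minimal PyTorch sketch, with architecture and sizes as illustrative assumptions.

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Minimal cGAN-style generator: the class label is embedded and
    concatenated with the noise vector, so the condition steers generation."""
    def __init__(self, z_dim=64, n_classes=10, out_dim=784):
        super().__init__()
        self.label_emb = nn.Embedding(n_classes, n_classes)
        self.net = nn.Sequential(
            nn.Linear(z_dim + n_classes, 256), nn.ReLU(),
            nn.Linear(256, out_dim), nn.Tanh())

    def forward(self, z, labels):
        return self.net(torch.cat([z, self.label_emb(labels)], dim=1))

g = ConditionalGenerator()
imgs = g(torch.randn(8, 64), torch.randint(0, 10, (8,)))
print(imgs.shape)  # torch.Size([8, 784]): one flattened image per label
```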
Chapter 7
[AI from Scratch] Episode 189: Conditional GAN (cGAN) — Adding Conditions for Data Generation
Recap: StyleGAN In the previous episode, we explored StyleGAN, a model that allows precise control over specific styles in image generation. StyleGAN’s architecture enables the adjustment of particular features (e.g., eyes, hairstyles) w...
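The style control recapped above rests on Adaptive Instance Normalization (AdaIN), which StyleGAN uses to inject a style vector at each generator layer: normalize the feature map, then rescale and shift it with style-derived parameters. A minimal PyTorch sketch, with shapes as illustrative assumptions:

```python
import torch

def adain(content, style_scale, style_bias, eps=1e-5):
    """Adaptive Instance Normalization: per-channel normalization of the
    feature map, followed by a style-controlled affine transform."""
    mu = content.mean(dim=(2, 3), keepdim=True)
    std = content.std(dim=(2, 3), keepdim=True) + eps
    normalized = (content - mu) / std
    return (style_scale[:, :, None, None] * normalized
            + style_bias[:, :, None, None])

x = torch.randn(2, 8, 16, 16)                       # feature maps
out = adain(x, torch.randn(2, 8), torch.randn(2, 8))  # style params per channel
print(out.shape)                                     # torch.Size([2, 8, 16, 16])
```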
Chapter 7
[AI from Scratch] Episode 188: StyleGAN — Achieving High-Quality Image Generation
Recap: CycleGAN In the previous episode, we explored CycleGAN, a GAN model that enables style transformation between different domains (e.g., day to night, photo to painting) without requiring paired data. This technology is useful for s...
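The pairing-free training recapped above hinges on the cycle-consistency loss: translating A to B and back to A should reproduce the original image. A minimal PyTorch sketch, where identity functions stand in for the two trained generators just so it runs:

```python
import torch
import torch.nn.functional as F

def cycle_consistency_loss(real_a, real_b, g_ab, g_ba, lam=10.0):
    """CycleGAN's cycle-consistency term: reconstruct each image after a
    round trip through both generators; this removes the need for paired
    training data. lam=10 follows the original paper's weighting."""
    recon_a = g_ba(g_ab(real_a))
    recon_b = g_ab(g_ba(real_b))
    return lam * (F.l1_loss(recon_a, real_a) + F.l1_loss(recon_b, real_b))

identity = lambda x: x        # placeholder generators
a, b = torch.rand(2, 3, 64, 64), torch.rand(2, 3, 64, 64)
print(cycle_consistency_loss(a, b, identity, identity).item())  # 0.0
```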
Chapter 7
[AI from Scratch] Episode 187: CycleGAN — Enabling Style Transformation with GANs
Recap: DCGAN (Deep Convolutional GAN) In the previous episode, we explained DCGAN (Deep Convolutional GAN), a GAN that uses Convolutional Neural Networks (CNN) to generate high-quality images, particularly in fields like image generation...
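As a reminder of the DCGAN recipe recapped above, here is a compact PyTorch generator built from transposed convolutions with BatchNorm and ReLU, per the DCGAN guidelines; the layer sizes are illustrative, not the episode's exact network.

```python
import torch
import torch.nn as nn

# Transposed convolutions upsample a noise vector into an image.
generator = nn.Sequential(
    nn.ConvTranspose2d(100, 128, 4, 1, 0), nn.BatchNorm2d(128), nn.ReLU(),
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(),
    nn.ConvTranspose2d(64, 1, 4, 2, 1), nn.Tanh())

z = torch.randn(8, 100, 1, 1)      # batch of noise vectors
print(generator(z).shape)           # torch.Size([8, 1, 16, 16])
```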
Chapter 7
[AI from Scratch] Episode 186: DCGAN (Deep Convolutional GAN)
Recap: Generative Adversarial Networks (GAN) In the previous episode, we discussed Generative Adversarial Networks (GAN). GANs consist of two models, the Generator and the Discriminator, that compete to generate new data. The generator c...
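The generator/discriminator competition recapped above comes down to two opposing losses. Here is a minimal PyTorch sketch using the commonly used non-saturating generator objective; logit shapes are illustrative.

```python
import torch
import torch.nn.functional as F

def gan_losses(d_real_logits, d_fake_logits):
    """The two sides of the GAN game: the discriminator learns to score
    real data as 1 and generated data as 0; the generator is trained to
    make the discriminator score its samples as 1."""
    ones = torch.ones_like(d_real_logits)
    zeros = torch.zeros_like(d_fake_logits)
    d_loss = (F.binary_cross_entropy_with_logits(d_real_logits, ones) +
              F.binary_cross_entropy_with_logits(d_fake_logits, zeros))
    g_loss = F.binary_cross_entropy_with_logits(d_fake_logits, ones)
    return d_loss, g_loss

d_loss, g_loss = gan_losses(torch.randn(16, 1), torch.randn(16, 1))
print(d_loss.item(), g_loss.item())
```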
Chapter 7
[AI from Scratch] Episode 185: Details of Generative Adversarial Networks (GAN)
Recap: Variational Autoencoder (VAE) In the previous episode, we explored Variational Autoencoders (VAE), a probabilistic generative model. VAEs compress data into a latent space and can generate new data based on this compressed represe...
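The latent-space sampling recapped above relies on the reparameterization trick, paired with a closed-form KL term in the loss. A minimal PyTorch sketch, with dummy encoder outputs just to run:

```python
import torch

def reparameterize(mu, log_var):
    """VAE reparameterization trick: z = mu + sigma * eps with
    eps ~ N(0, I), keeping the sampling step differentiable."""
    eps = torch.randn_like(mu)
    return mu + torch.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    """KL(q(z|x) || N(0, I)) in closed form, the VAE loss regularizer."""
    return -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())

mu, log_var = torch.zeros(4, 8), torch.zeros(4, 8)   # dummy encoder outputs
z = reparameterize(mu, log_var)
print(z.shape, kl_divergence(mu, log_var).item())    # torch.Size([4, 8]) 0.0
```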
Chapter 7
[AI from Scratch] Episode 184: Details of Variational Autoencoders (VAE)
Recap: Mechanism of Autoencoders In the previous episode, we explained Autoencoders in detail. Autoencoders compress (encode) data and reconstruct (decode) it from the compressed representation. This process is useful for tasks like dime...
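A minimal PyTorch autoencoder capturing the encode/decode pipeline recapped above; the layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Encoder compresses the input to a small latent code;
    decoder reconstructs the input from that code."""
    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, in_dim), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
x = torch.rand(16, 784)
loss = nn.functional.mse_loss(model(x), x)   # reconstruction error
print(loss.item())
```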
Chapter 7
[AI from Scratch] Episode 183: Details of Autoencoders — Understanding Encoding and Decoding Data
Recap and Introduction Hello! In the previous episode, we discussed Autoregressive Models, which are a type of generative model used to predict the next value based on time-series data. They generate the next step based on the current da...
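The next-step prediction recapped above can be shown with a tiny AR(p) forecaster in NumPy; the coefficients are hand-picked for illustration, whereas in practice they are fit to data (e.g., by least squares).

```python
import numpy as np

def ar_predict(history, coeffs, steps=5):
    """AR(p) forecasting: each new value is a weighted sum of the
    last p values, and predictions are fed back in autoregressively."""
    series = list(history)
    p = len(coeffs)
    for _ in range(steps):
        next_val = np.dot(coeffs, series[-p:][::-1])  # most recent first
        series.append(next_val)
    return series

# AR(2): x_t = 0.6 * x_{t-1} + 0.3 * x_{t-2}
print(ar_predict([1.0, 1.2, 1.5], coeffs=[0.6, 0.3], steps=3))
```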
Chapter 7
[AI from Scratch] Episode 182: Autoregressive Models
Recap: Generative Models In the previous episode, we covered the fundamental concepts of generative models. Generative models create new data based on training data and have diverse applications, such as image generation and text generat...
Chapter 7
[AI from Scratch] Episode 181: What Are Generative Models?
Recap: Chapter 6 Summary In the previous episode, we reviewed our understanding of model interpretability, highlighting the importance of interpreting models using methods such as SHAP values and LIME. These techniques make it easier to ...
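As a pointer back to those interpretability tools, here is a minimal SHAP usage sketch on a toy model; it assumes the shap and scikit-learn packages are installed, and the dataset is synthetic.

```python
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:10])
print(shap_values.shape)  # (10, 5): one attribution per sample and feature
```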
