Major Machine Learning Algorithms (31–60) – Understanding how commonly used machine learning algorithms work –
Lesson 59: Regression Evaluation Metrics — MSE, MAE, and More
Recap and Today's Topic Hello! In the previous session, we learned about evaluation metrics for classification tasks, such as accuracy, recall, and the F1 score. These metrics are useful when predicting categories. Today, we’ll focus on ...
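The teaser above names MSE and MAE. As a quick preview, here is a minimal pure-Python sketch of both regression metrics; the prediction values are made up for illustration:

```python
# Hypothetical true values and model predictions (illustrative numbers only).
y_true = [3.0, 5.0, 2.5, 7.0]
y_pred = [2.5, 5.0, 4.0, 8.0]

n = len(y_true)
# Mean Squared Error: average of squared residuals; penalizes large errors heavily.
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
# Mean Absolute Error: average of absolute residuals; less sensitive to outliers.
mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
```

Because MSE squares each residual, a single large miss dominates it, while MAE treats all misses linearly.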
Lesson 58: Model Evaluation Metrics (Classification) – Accuracy, Recall, F1 Score, and More
Recap and This Week’s Topic Hello! In the previous lesson, we explained grid search and random search, two popular methods for exploring hyperparameter combinations. You learned how to use these methods to find the optimal hyperparameter...
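As a preview of the classification metrics this lesson covers, the following sketch computes accuracy, precision, recall, and F1 from a toy set of binary labels (the labels are assumptions for illustration):

```python
# Toy binary labels and predictions (illustrative only).
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives

accuracy = (tp + tn) / len(y_true)                   # fraction predicted correctly
precision = tp / (tp + fp)                           # of predicted positives, fraction correct
recall = tp / (tp + fn)                              # of actual positives, fraction found
f1 = 2 * precision * recall / (precision + recall)   # harmonic mean of the two
```

The F1 score balances precision and recall, which matters when the classes are imbalanced and accuracy alone is misleading.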
Lesson 57: Grid Search and Random Search
Recap and Today's Topic Hello! In the previous session, we discussed hyperparameter tuning, where adjusting key parameters can significantly impact a model’s performance. Today, we’ll focus on two specific methods used to optimize hyperp...
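The two methods this lesson introduces can be sketched in a few lines. Here the "validation score" is a stand-in function (an assumption, replacing real model training) with its best value at `lr=0.1, depth=3`:

```python
import itertools
import random

# Hypothetical validation score; in practice this would train and evaluate a model.
def score(lr, depth):
    return -((lr - 0.1) ** 2) - (depth - 3) ** 2

grid = {"lr": [0.01, 0.1, 1.0], "depth": [1, 3, 5]}

# Grid search: exhaustively evaluate every combination (3 x 3 = 9 trials).
best_grid = max(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=lambda params: score(**params),
)

# Random search: sample a fixed budget of random combinations (5 trials here).
rng = random.Random(0)
candidates = [
    {"lr": rng.choice(grid["lr"]), "depth": rng.choice(grid["depth"])}
    for _ in range(5)
]
best_random = max(candidates, key=lambda params: score(**params))
```

Grid search guarantees the best point on the grid at the cost of trying everything; random search trades that guarantee for a fixed, usually smaller, budget.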
Lesson 56: Hyperparameter Tuning – How to Optimize Model Performance
Recap and This Week’s Topic Hello! In the previous lesson, we covered cross-validation, a method for evaluating how well a model generalizes to new data. Cross-validation provides an objective assessment of the model’s performance on uns...
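The core of k-fold cross-validation is the index split itself. A minimal sketch (no shuffling, remainder samples distributed over the first folds):

```python
# Split indices 0..n-1 into k folds; each fold is the validation set once,
# and the remaining indices form the training set.
def kfold_indices(n, k):
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return [(sorted(set(range(n)) - set(f)), f) for f in folds]

splits = kfold_indices(10, 5)  # 5 (train, validation) index pairs
```

Averaging a model's score over all k validation folds gives a more stable estimate than a single train/test split.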
Lesson 55: Cross-Validation in Detail
Recap and Today's Topic Hello! In the previous session, we discussed regularization techniques such as L1 and L2 regularization, which play a crucial role in preventing overfitting by ensuring that the model doesn't become too tailored t...
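The L1 and L2 penalties mentioned here are simple functions of the model's weights, added to the base loss. A minimal sketch (`lambda_` is the regularization strength, a hypothetical value):

```python
# L1 (lasso) penalty: sum of absolute weights; tends to drive weights to exactly 0.
def l1_penalty(weights, lambda_):
    return lambda_ * sum(abs(w) for w in weights)

# L2 (ridge) penalty: sum of squared weights; shrinks all weights smoothly.
def l2_penalty(weights, lambda_):
    return lambda_ * sum(w * w for w in weights)

weights = [0.5, -2.0, 0.0, 1.5]
l1 = l1_penalty(weights, 0.1)  # 0.1 * (0.5 + 2.0 + 0.0 + 1.5)
l2 = l2_penalty(weights, 0.1)  # 0.1 * (0.25 + 4.0 + 0.0 + 2.25)
```

During training, either penalty is added to the loss, so large weights cost extra and the model is pushed toward simpler solutions.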
Lesson 54: Regularization Methods – Explaining L1 and L2 Regularization
Recap and This Week’s Topic Hello! In the previous lesson, we discussed overfitting prevention, a method for preventing models from becoming too adapted to the training data, which reduces their generalizability to new data. This time, w...
Lesson 48: Loss Functions — Evaluating Model Errors
Recap and Today's Topic Hello! In the previous session, we learned about activation functions in neural networks, which play a crucial role in determining the output of each neuron. Today, we will explore loss functions, an essential com...
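As a preview, one common loss function for binary classification is binary cross-entropy, sketched below with made-up probabilities (the `eps` clamp guards against `log(0)`):

```python
import math

# Binary cross-entropy: penalizes confident wrong probabilities heavily.
def binary_cross_entropy(y_true, y_prob, eps=1e-12):
    total = 0.0
    for t, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)  # keep p strictly inside (0, 1)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)

low = binary_cross_entropy([1, 0], [0.9, 0.1])   # confident and correct: small loss
high = binary_cross_entropy([1, 0], [0.1, 0.9])  # confident and wrong: large loss
```

Minimizing such a loss is exactly the objective that gradient-based training optimizes.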
Lesson 53: Preventing Overfitting
Recap and Today's Topic Hello! In the previous session, we covered epochs and batch sizes, which are essential elements in model training, influencing the efficiency and convergence speed of learning, particularly when working with large...
Lesson 52: Epochs and Batch Size
Recap and This Week’s Topic Hello! In the previous lesson, we discussed the importance of learning rate, a crucial factor in determining the speed and accuracy of model training. Setting the learning rate appropriately can accelerate con...
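The epoch/batch structure this lesson covers reduces to a simple nested loop: one epoch is one full pass over the data, consumed in mini-batches. A minimal sketch with toy data:

```python
# Yield the data in consecutive mini-batches of batch_size samples.
def iter_batches(data, batch_size):
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]

data = list(range(10))  # 10 toy samples
epochs = 3
seen = 0
for epoch in range(epochs):
    for batch in iter_batches(data, batch_size=4):
        seen += len(batch)  # a real loop would run one gradient step per batch
# 10 samples per epoch in batches of 4, 4, and 2; 3 epochs -> 30 samples seen.
```

Smaller batches mean more (noisier) updates per epoch; larger batches mean fewer, smoother updates.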
Lesson 51: Learning Rate and Its Adjustment
Recap and Today's Topic Hello! In the previous session, we discussed Stochastic Gradient Descent (SGD), a crucial method for efficiently optimizing parameters, especially when working with large datasets. SGD is particularly powerful in ...
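The effect of the learning rate can be seen on a tiny example: gradient descent on f(x) = x², whose gradient is 2x. A moderate rate converges toward the minimum at 0, while a rate that is too large (above 1 for this function) diverges:

```python
# Run gradient descent on f(x) = x^2 from x0 with a given learning rate.
def descend(lr, steps=20, x0=1.0):
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x^2 is 2x
    return x

good = abs(descend(lr=0.1))  # each step multiplies x by 0.8 -> shrinks toward 0
bad = abs(descend(lr=1.1))   # each step multiplies x by -1.2 -> blows up
```

This is the trade-off the lesson describes: too small a rate is slow, too large a rate overshoots and never settles.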
Lesson 50: Stochastic Gradient Descent (SGD) – An Optimization Method for Large Datasets
Recap and This Week’s Topic Hello! In the previous lesson, we explored gradient descent, a fundamental algorithm for optimizing model parameters by adjusting them to minimize the loss function. While gradient descent is a powerful techni...
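SGD's key difference from full-batch gradient descent is that it updates the parameter after each (shuffled) sample rather than after the whole dataset. A sketch fitting y = w·x on noiseless toy data generated with w = 2:

```python
import random

rng = random.Random(42)
data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0, 4.0]]  # toy samples, true slope 2

w, lr = 0.0, 0.02
for epoch in range(200):
    rng.shuffle(data)          # visit samples in a new random order each epoch
    for x, y in data:
        grad = 2 * (w * x - y) * x  # gradient of (w*x - y)^2 for one sample
        w -= lr * grad              # one cheap update per sample
```

Each per-sample gradient is noisy, but each step is cheap, which is why SGD scales to datasets too large to process in one batch.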
Lesson 49: Gradient Descent – A Method for Finding Optimal Parameters
Recap and This Week’s Topic Hello! In the previous lesson, we explored loss functions, which are used to measure a model's error. A loss function evaluates the accuracy of predictions, and minimizing this loss is the primary goal of the ...
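Gradient descent itself fits in a few lines. This sketch fits the slope of y = w·x by repeatedly stepping against the gradient of the mean squared error (toy data generated with the true slope w = 2):

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated with true slope w = 2

w, lr = 0.0, 0.01
for _ in range(500):
    # dMSE/dw = (2/n) * sum((w*x - y) * x): the full-batch gradient
    grad = 2 * sum((w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step opposite the gradient to reduce the loss
```

Each iteration uses the entire dataset to compute one exact gradient; that cost per step is what the next lesson's SGD avoids.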
Lesson 47: Activation Functions – Explaining the Functions That Determine Neuron Output
Recap and This Week’s Topic Hello! In the previous lesson, we explored the perceptron, the basic unit of neural networks. The perceptron processes we...
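The perceptron recapped here is a weighted sum plus bias passed through a step activation. A minimal sketch, with weights hand-picked (an illustrative choice) so the unit realizes logical AND:

```python
# Single perceptron: weighted sum of inputs plus bias, then a step activation.
def perceptron(inputs, weights, bias):
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# With weights [1, 1] and bias -1.5, the unit fires only when both inputs are 1.
and_out = [perceptron([a, b], weights=[1.0, 1.0], bias=-1.5)
           for a in (0, 1) for b in (0, 1)]
```

Replacing the hard step with a smooth function (sigmoid, ReLU, etc.) is exactly what the activation-function lesson above discusses.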
Lesson 46: Perceptrons – The Fundamental Unit of Neural Networks
Recap and Today's Topic Hello! In the previous session, we learned about the basics of neural networks. Neural networks have a hierarchical structure consisting of input, hidden, and output layers, and they mimic the brain's neural circu...
Lesson 45: The Fundamentals of Neural Networks – Exploring the Basics of Artificial Neural Networks
Recap and This Week’s Topic Hello! In the previous lesson, we discussed CatBoost, a powerful boosting algorithm specialized for categorical variables. CatBoost offers automatic handling of categorical data and helps prevent overfitting, ...