Machine Learning Concepts/Terms
- Machine Learning: Computer algorithms that learn from data without being explicitly programmed. When applying them to new problems/domains, we only need to change the data instead of the algorithms.
- Machine Learning ≈ Looking for a Function From Data
- Supervised Machine Learning: The computer learns how to perform a task by looking at labelled training data (inputs paired with the correct outputs). It is one branch of machine learning.
- Cost Function: Quantifies how badly the model is doing by comparing the predicted values against the actual values. It is a way to measure the error of each prediction during the training process.
- Gradient Descent: An iterative optimization algorithm that can be used to minimize the cost function and find the best weights (see the sketch after this list).
- Optimizer Function: Tells the training framework (like TensorFlow) how we want to train the model, i.e. the algorithm used to update the model's weights. Running it performs one training step on the model.
- Features: The values (data attributes) that are fed into a prediction model (machine learning algorithm).
- Feature Engineering: Using your own knowledge of the problem (domain knowledge) to choose features or create new derived features that allow the machine learning algorithm to work more accurately (see the sketch after this list).
- One-Hot Encoding: Representing a categorical value as a vector in which exactly one element is one and all the others are zero (see the sketch after this list).
- Curse of Dimensionality: As the number of dimensions (or features) in the data increases, the number of data points required to build a good model grows exponentially.
- Grid Search: A way to find the best hyperparameters for a machine learning model by trying every combination from a predefined set of values (see the sketch after this list).
- Step Function: An activation function that outputs zero or one, depending on whether its input is below or above a threshold (see the sketch after this list).
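
A minimal sketch of gradient descent minimizing a mean-squared-error cost function for a one-variable linear model. The synthetic data, the learning rate, and the number of steps are all assumptions chosen for illustration, not part of the original notes.

```python
import numpy as np

# Synthetic data for y = 2x + 1 plus noise; all values here are assumed for illustration.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=50)

def cost(w, b):
    """Mean squared error: quantifies how far predictions are from the actual values."""
    return np.mean((w * x + b - y) ** 2)

# Gradient descent: repeatedly move the weights a small step in the direction
# that decreases the cost the fastest (the negative gradient).
w, b = 0.0, 0.0
learning_rate = 0.01
for _ in range(2000):
    y_hat = w * x + b
    grad_w = np.mean(2 * (y_hat - y) * x)  # d(cost)/dw
    grad_b = np.mean(2 * (y_hat - y))      # d(cost)/db
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final cost={cost(w, b):.4f}")
```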
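
A small feature-engineering sketch: the raw records and the derived features (BMI, age) are hypothetical examples of using domain knowledge to create more predictive inputs.

```python
# Hypothetical raw records for a fitness-related prediction task (made-up data).
records = [
    {"height_m": 1.75, "weight_kg": 70, "birth_year": 1990},
    {"height_m": 1.62, "weight_kg": 80, "birth_year": 1985},
]

def add_derived_features(record, current_year=2024):
    """Use domain knowledge to derive features the raw data only implies:
    body-mass index and age are often more useful than the raw columns."""
    record = dict(record)
    record["bmi"] = record["weight_kg"] / record["height_m"] ** 2
    record["age"] = current_year - record["birth_year"]
    return record

features = [add_derived_features(r) for r in records]
print(features[0])
```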
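
A small sketch of one-hot encoding; the category list and its ordering are made up for this example.

```python
# The categories and their order are assumptions for this example.
categories = ["red", "green", "blue"]

def one_hot(value, categories):
    """Return a vector with exactly one 1 (at the category's position) and 0s elsewhere."""
    return [1 if value == c else 0 for c in categories]

print(one_hot("green", categories))  # [0, 1, 0]
```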
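
A minimal grid-search sketch using scikit-learn's GridSearchCV (assuming scikit-learn is installed); the SVC model and the hyperparameter names/values are arbitrary choices for illustration.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# The model choice (an SVM classifier) and the hyperparameter values are arbitrary.
param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}

X, y = load_iris(return_X_y=True)

# GridSearchCV trains and cross-validates the model on every combination
# in param_grid (3 x 3 = 9 combinations here) and keeps the best one.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```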
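
A tiny step-function sketch; the threshold of 0 is an assumption for this example.

```python
def step(x, threshold=0.0):
    """Outputs 1 if the input reaches the threshold, otherwise 0."""
    return 1 if x >= threshold else 0

print(step(-0.3), step(0.7))  # -> 0 1
```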