[ JUPY ] Shortcuts in Jupyter Notebook
Posted on | In Programming, Python, Memo
- These notes cover many shortcuts that can be used in JupyterLab and Jupyter Notebook
- Shortcuts are especially essential in JupyterLab
[ HOML ] Chapter 02 - A Full Project
Posted on | In Programming, Python, Hands On Machine Learning
- A complete project that demonstrates every detail of machine learning
- Every step of the ML workflow is covered in this note
[ Memo ] Connect a List of Lists and Separate Them
Posted on | In Programming, Python, Data Cleaning
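The memo's own code is not reproduced in this listing, so the following is only a minimal sketch of one common approach, assuming "connect" means flattening the sublists into one list and "separate" means splitting that flat list back into the original shape; the example data is made up for illustration.

```python
from itertools import chain

# Placeholder data; not taken from the memo itself.
list_of_lists = [[1, 2], [3, 4, 5], [6]]

# Connect: flatten the list of lists into a single flat list.
flat = list(chain.from_iterable(list_of_lists))   # [1, 2, 3, 4, 5, 6]

# Record each sublist's length so the flat list can be separated again.
lengths = [len(sub) for sub in list_of_lists]      # [2, 3, 1]

# Separate: slice the flat list back into sublists of the recorded lengths.
separated, start = [], 0
for n in lengths:
    separated.append(flat[start:start + n])
    start += n

assert separated == list_of_lists
```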
[ HOML ] Chapter 1.5 - Machine Learning Project Checklist
Posted on | In Programming, Python, Hands On Machine Learning
- This note covers the checklist presented in the book
- It mainly represents a typical workflow of an ML project
[ LINX ] Compress Files and Dirs with Tar
Posted on | In Linux, Fundamental
- Compressing files accelerates the transfer process
- The notes will cover some examples, mainly the tar command (a rough sketch follows below)
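The note itself documents the Linux tar command, and its exact examples are not reproduced in this listing; as a hedged illustration of the same idea in Python (the language used for the other sketches here), the standard tarfile module can create and extract a gzip-compressed archive. The file and directory names are placeholders.

```python
import tarfile

# Create a gzip-compressed archive, roughly equivalent to: tar -czf backup.tar.gz mydir/
# "backup.tar.gz" and "mydir" are placeholder names, not taken from the note.
with tarfile.open("backup.tar.gz", "w:gz") as tar:
    tar.add("mydir")

# Extract it again, roughly equivalent to: tar -xzf backup.tar.gz -C restored/
with tarfile.open("backup.tar.gz", "r:gz") as tar:
    tar.extractall(path="restored")
```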
[ HOML ] Chapter 01 - The Fundamentals of Machine Learning
Posted on | In Programming, Python, Hands On Machine Learning
- A famous YouTuber recommended the book Hands-On Machine Learning
- The notes cover: why use machine learning, types of machine learning systems, and the main challenges of machine learning
[ ISLR ] Chapter 05 - Use R to Resample
Posted on | In Programming, R, ISLR
- As described in Chapter 5, various methods can be used to estimate the test error rate
- These notes cover the validation set approach, LOOCV, k-fold CV, and the bootstrap
[ ISLR ] Chapter 05 - Resampling
Posted on | In Data Analysis, Statistical Learning
- There are different ways to estimate the test error properly. For example, some methods apply a mathematical adjustment to the training error rate to estimate the test error rate; others, like cross-validation, hold out a subset of the training observations from the fitting process.
- The notes will cover leave-one-out and k-fold cross-validation, the bias-variance trade-off of cross-validation, and the bootstrap (a minimal k-fold sketch follows below).
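ISLR itself works in R and the note follows the book, so the following is only a minimal sketch of the hold-out idea described above, written in Python with scikit-learn to match the other sketches in this listing; the toy data and the linear model are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import KFold

# Toy data; placeholders, not taken from the note.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3 * X[:, 0] + rng.normal(size=100)

# k-fold CV: each fold is held out from fitting once and used to estimate the test error.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
fold_errors = []
for train_idx, test_idx in kf.split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    fold_errors.append(mean_squared_error(y[test_idx], pred))

# The CV estimate of the test error is the average error over the held-out folds.
print(np.mean(fold_errors))
```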