
Hi, there

How to do deep learning using custom Jupyter kernels on Sherlock

A recipe for interactive computing using custom Jupyter kernels on Stanford's Sherlock.

Setting up a custom conda environment on Sherlock's login node

1. Download and install Miniconda

```bash
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
# install
bash Miniconda3-latest-Linux-x86_64.sh
conda config --set always_yes yes
```

2. Install Jupyter Notebook/Lab and secure your notebooks with a password

```bash
# install the default py3 kernel for Jupyter Notebook
conda install ipython jupyter notebook jupyterlab
# add a password
jupyter notebook password
```

3. …

Graph

Graphs. Some data structures to keep in mind:

- BinaryHeap: complete binary tree
  - MaxHeap: parent > both children; IndexMaxHeap
  - MinHeap: parent < both children; IndexMinHeap
  - Priority queue (MaxHeap)
- BinarySearchTree: not always a complete binary tree; leftChild < parent < rightChild
- DenseGraph
- SparseGraph

Code snippets are taken from Play with Algorithm.
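The heap invariants listed above can be sketched in a few lines. A minimal illustrative max-heap, stored as a flat list (class and method names are my own, not from Play with Algorithm):

```python
# Minimal max-heap sketch: parent >= both children, stored as a complete
# binary tree in a flat list (children of index i live at 2*i+1 and 2*i+2).
class MaxHeap:
    def __init__(self):
        self.data = []

    def push(self, v):
        self.data.append(v)
        i = len(self.data) - 1
        # sift up until the parent is no smaller than the new element
        while i > 0 and self.data[(i - 1) // 2] < self.data[i]:
            parent = (i - 1) // 2
            self.data[i], self.data[parent] = self.data[parent], self.data[i]
            i = parent

    def pop(self):
        # take the root, move the last element up, then sift it down
        top = self.data[0]
        last = self.data.pop()
        if self.data:
            self.data[0] = last
            i, n = 0, len(self.data)
            while True:
                largest = i
                for c in (2 * i + 1, 2 * i + 2):
                    if c < n and self.data[c] > self.data[largest]:
                        largest = c
                if largest == i:
                    break
                self.data[i], self.data[largest] = self.data[largest], self.data[i]
                i = largest
        return top

h = MaxHeap()
for v in [3, 1, 4, 1, 5, 9, 2, 6]:
    h.push(v)
print([h.pop() for _ in range(8)])  # descending order: [9, 6, 5, 4, 3, 2, 1, 1]
```

Flipping the comparisons in `push` and `pop` gives the corresponding MinHeap.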

Priority Queue

min-heap

```python
import heapq

# create a priority queue in Python; heapq will ensure the list maintains the heap property.
priority_queue = []

# add an element. The smallest element will always be at the root, i.e., priority_queue[0].
heapq.heappush(priority_queue, (priority, item))
```

Use heapq …
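A quick usage sketch of the pattern above — the priority/item values here are made up for illustration; `heapq` compares the `(priority, item)` tuples element-wise, so the lowest priority number sits at the root:

```python
import heapq

priority_queue = []
# lower number = higher priority; tuples are compared element-wise
heapq.heappush(priority_queue, (2, "write code"))
heapq.heappush(priority_queue, (1, "fix bug"))
heapq.heappush(priority_queue, (3, "release"))

print(priority_queue[0])              # (1, 'fix bug') -- smallest stays at the root
print(heapq.heappop(priority_queue))  # (1, 'fix bug')
print(heapq.heappop(priority_queue))  # (2, 'write code')
```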

Binary tree

Binary trees. Difference. Some data structures to keep in mind:

- BinaryHeap: complete binary tree
  - MaxHeap: parent > both children; IndexMaxHeap
  - MinHeap: parent < both children; IndexMinHeap
  - Priority queue (MaxHeap)
- BinarySearchTree: not always a complete binary tree; leftChild < parent < rightChild
- DenseGraph
- SparseGraph

Code snippets are taken from Play with Algorithm …
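The leftChild < parent < rightChild invariant means an in-order traversal visits keys in sorted order. A minimal Python sketch of my own (the course's snippets are in C++):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # standard BST insert: smaller keys go left, larger go right
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def inorder(root):
    # leftChild < parent < rightChild => in-order traversal is sorted
    if root is None:
        return []
    return inorder(root.left) + [root.key] + inorder(root.right)

root = None
for k in [5, 2, 8, 1, 3]:
    root = insert(root, k)
print(inorder(root))  # [1, 2, 3, 5, 8]
```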

Sort

Sort algorithms. Code snippets are taken from Play with Algorithm. Some algorithms to keep in mind: selectionSort, insertionSort, mergeSort, quickSort (two-way, three-way), heapSort.

1. insertionSort

```cpp
#include <iostream>
#include …
```
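The post's snippet is C++; here is a Python sketch of insertionSort following the same idea — grow a sorted prefix, shifting larger elements right before placing the current one:

```python
def insertion_sort(a):
    # grow a sorted prefix a[:i]; insert a[i] into its place within it
    for i in range(1, len(a)):
        e = a[i]
        j = i
        while j > 0 and a[j - 1] > e:
            a[j] = a[j - 1]   # shift larger elements one slot right
            j -= 1
        a[j] = e
    return a

print(insertion_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```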

What is Big O

What on earth is Big O? Time complexity and space complexity. Time complexity O(f(n)): the number of instructions that need to be executed is proportional to f(n). Strictly speaking, …
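One way to see "number of instructions proportional to f(n)" is to count the inner-loop executions of a quadratic algorithm at two input sizes — a rough illustration of my own, not a formal definition:

```python
def count_pair_comparisons(n):
    # a doubly nested loop executes its body n*n times: O(n^2)
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1
    return count

print(count_pair_comparisons(10))   # 100
print(count_pair_comparisons(100))  # 10000 -- 10x the input, 100x the work
```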

Statistical Modeling and Inference

A brief review of the foundations of statistical inference. Statistical Models and Inference. Statistical inference: a formal approach to characterizing a random phenomenon using observations, either by providing a description of a past phenomenon or by giving some predictions about future phenomena of a similar nature. 1. Statistical Models. The first step in statistical inference is to specify a statistical model, under some simplifying assumptions (i. …
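As a toy illustration of "specify a model, then infer from observations" (my own example, not from the post): model coin flips as i.i.d. Bernoulli(p) and estimate p by its maximum-likelihood value, which for the Bernoulli model is the sample mean.

```python
# Model: X_1..X_n i.i.d. Bernoulli(p). The MLE of p is the sample mean.
def bernoulli_mle(observations):
    return sum(observations) / len(observations)

flips = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # made-up data: 1 = heads
p_hat = bernoulli_mle(flips)
print(p_hat)  # 0.7
```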

Loss function for multi-label classification

Multi-label classification: tasks commonly seen in health record data (multiple symptoms). Loss function design: Multi binary cross-entropy — each class has a binary output. Label smoothing, another regularization technique. It's designed to make the model a little bit less certain of its decision by changing its target slightly: instead of wanting it to predict 1 for the correct class and 0 for all the others, we ask it to predict 1−ε for the correct class and ε for all the others, with ε a (small) positive number and N the number of classes.
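The smoothed targets described above can be built directly. A sketch following the post's formulation (1−ε for present classes, ε for the rest — note some variants use ε/N instead), combined with a per-class binary cross-entropy; the label and probability values are hypothetical:

```python
import math

def smooth_targets(multi_hot, eps=0.1):
    # replace hard 0/1 targets: present classes get 1-eps, the rest get eps
    return [1 - eps if y == 1 else eps for y in multi_hot]

def binary_cross_entropy(targets, probs):
    # multi binary cross-entropy: one binary term per class, summed
    return -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for t, p in zip(targets, probs))

labels = [1, 0, 0, 1]             # two symptoms present out of N=4 classes
targets = smooth_targets(labels)  # [0.9, 0.1, 0.1, 0.9]
probs = [0.8, 0.2, 0.1, 0.7]      # hypothetical per-class sigmoid outputs
print(targets)
print(binary_cross_entropy(targets, probs))
```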