Posts

    • Hidden Markov Chain Model

      Hidden Markov Model

      The whole reason I’m recording this is that RF uses some of the properties of the Hidden Markov Model or the Kalman-filter model to estimate the hidden state. Note that the difference between the HMM and the Kalman filter is that the former works with a discrete hidden-state space while the latter assumes a continuous one.
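
      As a rough sketch of that contrast (standard textbook formulations, not taken from the post itself): both models share the chain structure $z_{t-1} \to z_t \to x_t$, but the HMM draws $z_t$ from a finite set via a transition matrix $A$, whereas the Kalman filter assumes linear-Gaussian dynamics over a continuous state:

      $$P(z_t = j \mid z_{t-1} = i) = A_{ij}, \qquad x_t \sim P(x_t \mid z_t) \quad \text{(HMM)}$$

      $$z_t = F z_{t-1} + w_t,\ w_t \sim \mathcal{N}(0, Q), \qquad x_t = H z_t + v_t,\ v_t \sim \mathcal{N}(0, R) \quad \text{(Kalman filter)}$$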

      The Bayesian network

      The Bayesian network is merely a way to think about conditional probabilities graphically.
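
      As a one-line example of what the graph encodes (the standard definition, not something specific to the post): each node depends directly only on its parents, so the joint distribution factorizes as

      $$P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\!\left(X_i \mid \mathrm{Pa}(X_i)\right),$$

      where $\mathrm{Pa}(X_i)$ denotes the parents of $X_i$ in the graph.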

    • 浅谈数据脱敏 (A Brief Discussion of Data Masking)

      Background

      The main reason for writing on this topic is a case study I heard at the power-industry information annual conference in Nanning on how a big-data masking algorithm was implemented. At that conference, a postdoctoral researcher from BUPT walked us through the main measurement models for big-data masking and the distributed algorithms built around the difficulties and pain points of those models. I knew little about data masking before attending, so I have taken the opportunity of writing this post to do some research on the subject, in the hope that it can serve as a starting point for further discussion.

    • Lecture Notes from CS 231N

      CS 231N side notes

      CNN models

      • KNN
        • The computational complexity of the Nearest Neighbor classifier is an active area of research. Approximate Nearest Neighbor (ANN) algorithms can accelerate the lookup (see the sketch after this list).
      • Subderivative
        • At a point where the function is not differentiable, basically any slope between the two one-sided derivatives can be used.
      • The power of preprocessing
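
      As a minimal illustration of the Nearest Neighbor point above (my own sketch in numpy, not code from the lecture notes), the brute-force distance scan below is exactly the cost that ANN methods try to avoid:

      ```python
      import numpy as np

      def nearest_neighbor_predict(X_train, y_train, X_test):
          """Brute-force 1-NN classifier: one full pass over the training
          set per query, which is what ANN algorithms approximate away."""
          preds = []
          for x in X_test:
              # L2 distance from the query to every training point
              dists = np.linalg.norm(X_train - x, axis=1)
              preds.append(y_train[np.argmin(dists)])
          return np.array(preds)

      # toy usage on random data
      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(100, 32))
      y_train = rng.integers(0, 10, size=100)
      X_test = rng.normal(size=(5, 32))
      print(nearest_neighbor_predict(X_train, y_train, X_test))
      ```
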
    • Activation Functions in Deep Learning

      Activation Functions

      In this post, I’ll talk about common activation functions and their impact on neural networks; a small numpy sketch of these functions follows the list below.

      • step function (or the perceptron model)
      • sigmoid
        • Maps the output to the region $(0,1)$.
      • tanh
        • Maps the output to the region $(-1,1)$.
      • ReLU
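
      For concreteness, here is a minimal numpy sketch of the functions listed above (my own illustration, not code from the post; conventions for the value at $x = 0$ vary):

      ```python
      import numpy as np

      def step(x):
          # perceptron-style step function: 1 for x >= 0, else 0
          return (x >= 0).astype(float)

      def sigmoid(x):
          # squashes inputs into the open interval (0, 1)
          return 1.0 / (1.0 + np.exp(-x))

      def tanh(x):
          # squashes inputs into the open interval (-1, 1)
          return np.tanh(x)

      def relu(x):
          # passes positive inputs through, zeroes out the rest
          return np.maximum(0.0, x)

      x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
      for f in (step, sigmoid, tanh, relu):
          print(f.__name__, f(x))
      ```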