Tag: Deep Learning, Andrew Ng
Category: Deep Learning, Andrew Ng
Article From: https://www.cnblogs.com/xingzhelin/p/9124019.html

Knowledge points

1. ReLU (Rectified Linear Unit) activation function: ReLU(z) = max(0, z)

The ReLU activation function is widely used in neural networks. Compared with the sigmoid activation function covered in the machine learning course, it has the following advantages (see the sketch after this list):

1. The sigmoid activation function saturates: its gradient approaches zero for large |z|, so sigmoid is usually not used as the hidden-layer activation.

2. ReLU converges faster during training.
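
To make the saturation point concrete, here is a minimal NumPy sketch (my own illustration, not from the course notes) comparing the gradients of the two activations:

```python
import numpy as np

def relu(z):
    """ReLU: max(0, z), applied element-wise."""
    return np.maximum(0, z)

def sigmoid(z):
    """Sigmoid: 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])

# Sigmoid gradient is s * (1 - s); it vanishes for large |z| (saturation).
s = sigmoid(z)
print("sigmoid grad:", s * (1 - s))        # ~4.5e-05 at z = +/-10

# ReLU gradient is 1 for z > 0 and 0 otherwise; it does not shrink for
# positive inputs, which is one reason gradient descent converges faster.
print("relu grad:   ", np.where(z > 0, 1.0, 0.0))
```

At z = ±10 the sigmoid gradient is on the order of 1e-5, so weight updates all but stop; the ReLU gradient stays at 1 for any positive input.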

2. Common supervised learning application scenarios

3. Structured data and unstructured data

Structured data can be represented and stored in a relational database as a two-dimensional table. Its defining characteristic is that the row is the unit of storage: each row represents the record of one entity, and every row has the same set of attributes.

Unstructured data has no fixed structure: documents, images, video, and audio all fall into this category. Such data is usually stored as a whole, in a binary format (see the sketch below).

Semi-structured data includes mail, HTML, reports, and resource repositories; examples are mail systems, web clusters, teaching resource libraries, data mining systems, and file systems.
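
As a rough illustration of the distinction, here is a hypothetical Python sketch; the table names and data are made up, and an in-memory SQLite database stands in for a relational store:

```python
import sqlite3

# Structured: every row has the same attributes, so the data maps
# naturally onto a two-dimensional relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", 30))

# Unstructured: an image has no fixed row/column schema, so it is kept
# whole and stored as a binary blob (placeholder bytes here).
image_bytes = b"\x89PNG\r\n\x1a\n..."  # stand-in for raw file contents
conn.execute("CREATE TABLE files (name TEXT, data BLOB)")
conn.execute("INSERT INTO files VALUES (?, ?)", ("photo.png", image_bytes))

print(conn.execute("SELECT name, age FROM users").fetchall())
print(conn.execute("SELECT name, length(data) FROM files").fetchall())
```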

4. An illustration of the development trend of deep learning

  Increasing the scale of the data and the scale of the neural network can greatly improve deep learning performance.

5. Supplementary mathematical knowledge points:

  Sparsity: the proportion of elements in a vector that are zero. A vector is sparse when most of its elements are zero, and sparsity is closely related to the compressibility of a signal (see the sketch below).
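
A small NumPy sketch (my own example, not from the course) that measures sparsity as the fraction of zero entries and shows the link to compact storage:

```python
import numpy as np

v = np.array([0.0, 3.1, 0.0, 0.0, -1.2, 0.0])

# Sparsity here: the fraction of entries that are exactly zero.
sparsity = np.mean(v == 0)
print(sparsity)                      # 0.666... -> the vector is mostly zeros

# A sparse vector compresses well: store only (index, value) pairs
# for the nonzero entries instead of the full vector.
idx = np.nonzero(v)[0]
compact = list(zip(idx.tolist(), v[idx].tolist()))
print(compact)                       # [(1, 3.1), (4, -1.2)]
```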

6. Resources

http://mooc.study.163.com/learn/2001281002?tid=2001392029#/learn/content?type=detail&id=2001701005&cid=2001696104
https://www.coursera.org/specializations/deep-learning
