Sri Harsha Gajavalli conducted a session on 22nd February 2020 on neural networks. It was aimed at a beginner audience with little or no knowledge of the topic. The talk started off with a fun binary question-and-answer game involving the audience, which continued throughout the session and held everyone's attention.

He began by stressing the need for machine learning and the kinds of problems that call for machine learning solutions. Justifying why a problem is a machine learning problem is a necessary first step. A simple heuristic: if solving the problem by hand would require a very large number of if conditions, it is likely a machine learning problem.

He further explained that the terms AI and ML differ significantly and are not synonyms. AI is about decision making, whereas ML is about learning from datasets. They are as different as learning and remembering are. An interesting example of biological neural nets is the "pigeons as art critics" experiment, in which pigeons were trained to differentiate between paintings of different artists; the best part was their high accuracy when tested.

ANN: An artificial neural network has 2 components - 1] nodes 2] weights

Neural networks can only take numerical values as input data, so categorical data has to be expressed as numeric data before it is fed to the network. A node has the structure: input -------> node (some hidden mathematical function) --------> output

The hidden function can be an activation/squashing function. Activations are used to introduce non-linearity, which helps the network make accurate predictions, and also to limit the range of the output. The weights help decide which features are important.

Feed Forward Net: Here the information flow is unidirectional. Data is passed to the input layer, then to the hidden layer, and finally to the output layer. To select the right weight for a node, back propagation can be used.
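The node structure described above can be sketched in a few lines of Python. This is a minimal illustration, not from the talk: a single hypothetical neuron with hand-picked weights and a sigmoid squashing function.

```python
import math

def sigmoid(x):
    # Squashing function: limits the output to the range (0, 1)
    # and introduces non-linearity.
    return 1.0 / (1.0 + math.exp(-x))

def node(inputs, weights, bias):
    # A node computes a weighted sum of its numeric inputs,
    # then applies the hidden activation function.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Example: two numeric inputs with illustrative weights
output = node([1.0, 2.0], [0.5, -0.25], bias=0.1)
print(round(output, 3))  # → 0.525
```

Larger weights on a feature pull the weighted sum, and hence the output, more strongly in that feature's direction, which is how weights encode feature importance.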
That is, we start off with small random weights, calculate the error, i.e. the difference between the predictions and the actual values, and then adjust the weights to minimize that error. This process of error minimization is known as gradient descent. The advantage of this approach is that it works relatively fast, but it needs a training set and can at times be extremely slow.

An example is voice recognition: recognizing the voices of 2 persons. A feed forward network is used and the following steps are repeated multiple times: 1] Present the data 2] Calculate the error 3] Backpropagate the error 4] Adjust the weights

Some of the applications of feed forward nets are pattern recognition, stock market prediction, sonar mine/rock recognition etc.

Recurrent Networks: Here the data flow is multi-directional, as the nodes connect back to other nodes or to themselves. They are mainly used for language tasks. They differ from simple ANNs in that they have a sense of time and a memory of previous states.

Convolutional Neural Networks: Here kernels are used as local feature detectors. The main idea is that the network learns which kernels are most useful and applies those kernels across the entire input, which reduces the number of parameters and the variance.

Example of CNNs on image data: Applying kernels directly to an image under-uses the edge pixels; padding is added around the image to avoid this effect. Stride is the step size with which the kernel moves across the image; a stride greater than one reduces the output dimensions. Images have multiple channels, i.e. multiple numbers are associated with a single pixel location; the number of channels is known as the depth. A kernel has the same depth as the image, and each kernel outputs a single number per pixel location. Pooling refers to shrinking the dimensions of an image.
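The pooling idea above can be shown with a short sketch. This is my own illustrative code, not from the session: 2x2 max-pooling with stride 2 over a single-channel "image" represented as a list of lists.

```python
def max_pool(image, size=2, stride=2):
    # Slide a size x size window across the image with the given stride,
    # keeping only the maximum value in each window. This shrinks the
    # spatial dimensions while retaining the strongest responses.
    rows, cols = len(image), len(image[0])
    pooled = []
    for r in range(0, rows - size + 1, stride):
        row = []
        for c in range(0, cols - size + 1, stride):
            window = [image[r + i][c + j]
                      for i in range(size) for j in range(size)]
            row.append(max(window))
        pooled.append(row)
    return pooled

# A 4x4 single-channel image pooled down to 2x2
img = [
    [1, 3, 2, 4],
    [5, 6, 7, 8],
    [3, 2, 1, 0],
    [1, 2, 3, 4],
]
print(max_pool(img))  # → [[6, 8], [3, 4]]
```

Replacing `max(window)` with the window's mean would give average-pooling instead; the sliding-window mechanics stay the same.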
There are different kinds of pooling operations, like max-pool, average-pool etc. The session was a really interesting one and an important first step into the world of neural networks. Written by Manasi Khandekar.