National Institute of Technology Rourkela


An Institute of National Importance

Syllabus

Course Details

Subject {L-T-P / C} : EC4701 : Soft Computing Laboratory {0-0-2 / 1}

Subject Nature : Practical

Coordinator : Samit Ari

Syllabus

Module 1 :

1. Implementation of 2-input AND and OR logic functions using a perceptron. Start with different sets of initial weights and show that there is more than one solution to the problem.
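A minimal sketch of this exercise in Python (NumPy, the learning rate, the epoch budget, and the bias-as-extra-input convention are assumptions, not part of the exercise statement):

```python
import numpy as np

def train_perceptron(X, t, w, lr=0.1, epochs=1000):
    """Train a single perceptron with step activation on inputs X (bias column included)."""
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, t):
            y = 1 if np.dot(w, x) >= 0 else 0   # step activation
            w = w + lr * (target - y) * x        # perceptron update rule
            errors += int(y != target)
        if errors == 0:                          # converged: all patterns correct
            break
    return w

# 2-input truth tables with a bias input fixed at 1
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
t_and = np.array([0, 0, 0, 1])
t_or  = np.array([0, 1, 1, 1])

# Different random initial weights converge to different separating lines
for seed in (0, 1):
    rng = np.random.default_rng(seed)
    w = train_perceptron(X, t_and, rng.uniform(-0.5, 0.5, 3))
    print("AND weights (seed", seed, "):", w)
```

Different seeds yield different final weight vectors that all classify AND correctly, illustrating that the solution is not unique.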

2. Develop an algorithm using the Hebbian learning rule to solve the NAND and NOR problems. Assume the elements of the initial weight matrix are random values and study the effect of different learning rates.
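One way to sketch this is with a bipolar (±1) encoding, under which a single pass of the plain Hebb rule w ← w + η·t·x separates NAND and NOR; the encoding, the learning rates tried, and the initial-weight range below are assumptions, not part of the exercise statement:

```python
import numpy as np

def hebb_train(X, t, w, lr=1.0):
    """One pass of the Hebbian rule: w <- w + lr * t_i * x_i for each pattern."""
    for x, target in zip(X, t):
        w = w + lr * target * x
    return w

def predict(X, w):
    return np.where(X @ w >= 0, 1, -1)   # bipolar sign activation

# Bipolar truth table with a bias input fixed at 1
X = np.array([[-1, -1, 1], [-1, 1, 1], [1, -1, 1], [1, 1, 1]], dtype=float)
t_nand = np.array([1, 1, 1, -1])
t_nor  = np.array([1, -1, -1, -1])

rng = np.random.default_rng(0)
for lr in (0.1, 1.0, 10.0):              # study the effect of the learning rate
    w0 = rng.uniform(-0.1, 0.1, 3)       # small random initial weights
    w = hebb_train(X, t_nand, w0, lr)
    print("lr =", lr, "NAND correct:", np.array_equal(predict(X, w), t_nand))
```

With a small learning rate the random initial weights can dominate the Hebbian term and spoil the classification, which is exactly the effect the exercise asks you to study.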

3. Solve the following Boolean functions using the perceptron learning rule; take the Boolean function as user input.
z = x1 + x2 + x1x2 and z = x1'x2'x3' + x1'x2'x3 + x1x2'x3' + x1x2x3

4. Consider the function f(x) = x^3 - 2x^2 - 1 and find its root using Newton's method with the initial guess x0 = 1. Also consider the function f(x1, x2) = x1^3 + 2 x1x2 - x1^2 x2^2 and find its stationary point using Newton's method (second-order approximation) with the initial guess x0 = [1, -1]^T.
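Both parts can be sketched as follows; the iteration counts and NumPy are assumptions, and the second part applies the second-order Newton step x ← x − H⁻¹∇f to seek a stationary point, since only the gradient and Hessian appear in the second-order approximation:

```python
import numpy as np

# Part 1: 1-D Newton root finding for f(x) = x^3 - 2x^2 - 1
f  = lambda x: x**3 - 2 * x**2 - 1
df = lambda x: 3 * x**2 - 4 * x

x = 1.0                                   # initial guess x0 = 1
for _ in range(50):
    x = x - f(x) / df(x)                  # Newton update
print("root:", x)                         # converges near x = 2.2056

# Part 2: 2nd-order Newton for f(x1, x2) = x1^3 + 2 x1x2 - x1^2 x2^2
def grad(v):
    x1, x2 = v
    return np.array([3 * x1**2 + 2 * x2 - 2 * x1 * x2**2,
                     2 * x1 - 2 * x1**2 * x2])

def hess(v):
    x1, x2 = v
    return np.array([[6 * x1 - 2 * x2**2, 2 - 4 * x1 * x2],
                     [2 - 4 * x1 * x2,   -2 * x1**2]])

v = np.array([1.0, -1.0])                 # initial guess [1, -1]^T
for _ in range(50):
    v = v - np.linalg.solve(hess(v), grad(v))   # x <- x - H^-1 g
print("stationary point:", v)
```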

5. Demonstrate that the EX-OR and XNOR gates are non-linearly separable problems. Design an MLP for the purpose and train it using the BP algorithm. Assume a logistic function for the nonlinearity.
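A hedged sketch of the MLP part: the 2-4-1 architecture, learning rate, epoch count, and random seed below are assumptions, since the exercise fixes only the logistic nonlinearity and BP training (XNOR is the complement of the XOR target column):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

# 2-4-1 MLP with logistic units; weight matrices include bias rows
W1 = rng.uniform(-1, 1, (3, 4))           # input (+bias) -> hidden
W2 = rng.uniform(-1, 1, (5, 1))           # hidden (+bias) -> output
lr = 0.5

Xb = np.hstack([X, np.ones((4, 1))])      # append bias input
for _ in range(20000):
    h = sigmoid(Xb @ W1)                  # forward pass, hidden layer
    hb = np.hstack([h, np.ones((4, 1))])
    y = sigmoid(hb @ W2)                  # output layer
    # backward pass: delta rule with logistic derivative y(1-y)
    d2 = (y - t) * y * (1 - y)
    d1 = (d2 @ W2[:-1].T) * h * (1 - h)
    W2 -= lr * hb.T @ d2                  # gradient-descent updates
    W1 -= lr * Xb.T @ d1
print("outputs after training:", y.ravel())
```

A single perceptron cannot reach low error on this data, which is the demonstration of non-linear separability the exercise asks for.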

6. Given the dataset [-2, -1, 0, 1, 2] with targets t(-2) = 0, t(-1) = 0.25, t(0) = 0.5, t(1) = 0.75, and t(2) = 1, determine the weights of all neurons with sigmoid transfer functions such that the MSE is almost zero. Use an MLP with one neuron in the input layer, five neurons in the hidden layer, and one neuron in the output layer.
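This regression task can be sketched with the same backpropagation machinery; the learning rate, iteration count, and random seed are assumptions, while the 1-5-1 architecture and sigmoid transfer functions come from the exercise statement:

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Dataset from the exercise: targets follow t = (x + 2) / 4
x = np.array([[-2.0], [-1.0], [0.0], [1.0], [2.0]])
t = np.array([[0.0], [0.25], [0.5], [0.75], [1.0]])

# 1-5-1 MLP, sigmoid transfer functions in hidden and output layers
W1 = rng.uniform(-1, 1, (2, 5))           # input (+bias) -> 5 hidden
W2 = rng.uniform(-1, 1, (6, 1))           # hidden (+bias) -> output
lr = 0.5

xb = np.hstack([x, np.ones((5, 1))])
for _ in range(100000):
    h = sigmoid(xb @ W1)                  # forward pass
    hb = np.hstack([h, np.ones((5, 1))])
    y = sigmoid(hb @ W2)
    d2 = (y - t) * y * (1 - y)            # backprop through output sigmoid
    d1 = (d2 @ W2[:-1].T) * h * (1 - h)
    W2 -= lr * hb.T @ d2
    W1 -= lr * xb.T @ d1
mse = float(np.mean((y - t) ** 2))
print("MSE:", mse)
```

Note that the endpoint targets 0 and 1 are reached only asymptotically by a sigmoid output, so the MSE approaches but never exactly equals zero.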

7. Solve the same problem again using a single perceptron trained with the backpropagation training algorithm, and compare the MSE of both architectures in the same graph, using the same learning rate and convergence conditions.

8. Mini Project

Course Objective

1. To understand the theory of Soft Computing and its applications.
2. To understand different learning approaches in neural networks.
3. To understand different kinds of perceptron mechanisms and RBF.
4. To develop applications for different real-life use cases.

Course Outcome

CO1: To solve linearly separable problems using universal logic gates.
CO2: To implement non-linearly separable problems using logic gates.
CO3: To implement and acquire knowledge of Hebbian learning.
CO4: To implement the multilayer perceptron, backpropagation, and radial basis functions.
CO5: To apply these methods to different real-life applications.

Essential Reading

1. S. Haykin, Neural Networks: A Comprehensive Foundation, Pearson Education, India.
2. Jang, Sun and Mizutani, Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, Prentice Hall of India.

Supplementary Reading

1. Satish Kumar, Neural Networks: A Classroom Approach, Tata McGraw-Hill.
2. Martin T. Hagan, Howard B. Demuth, Mark H. Beale, Neural Network Design, Thomson.