

CS 659 Image Processing

Homework #4

Covering Lectures 10, 11, 12, 13

NOTES: Submit only the homework "solution" (do not include these homework questions) in Microsoft Word format to http://moodle.njit.edu/ before the above deadline. Absolutely no late submission is accepted. Write the answers in your own words, individually. Any plagiarism will result in a "ZERO" score or a "FAIL" grade. The course requires Matlab programming; you can download the Matlab software from the NJIT website: http://ist.njit.edu/software/download.php.

There are 4 questions in total, each worth 25 points. Grading checks the correctness and completeness of the resulting images, Matlab code, and text responses.

Reading material: HW4R

4.1 (25 points) Compute the covariance matrix of the data given by a 2 by 4 data matrix whose rows are realizations of the random vector

X = [X1 X2 X3 X4]

of size 1 by 4. Note that the covariance matrix is K = E[(X - m)^T (X - m)], where m = E[X] is the mean vector. In this example, K is of size 4 by 4.

Answer:
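
As a sanity check on a hand computation (not a substitute for the submitted Matlab answer), the covariance computation can be sketched in Python/NumPy; the data matrix below is hypothetical and should be replaced by the values given in the lecture:

```python
import numpy as np

# Hypothetical 2-by-4 data matrix: 2 observations (rows) of the
# 4-dimensional random vector [X1, X2, X3, X4].
X = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 1.0, 4.0, 3.0]])

m = X.mean(axis=0)        # mean vector m, size 1 by 4
D = X - m                 # zero-mean data
K = D.T @ D / X.shape[0]  # covariance matrix K, size 4 by 4

print(K.shape)  # (4, 4)
```

Each entry K[i, j] is the sample average of (Xi - mi)(Xj - mj), so K is symmetric and 4 by 4 regardless of the number of observations.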

4.2 (25 points) Develop a perceptron for the AND function with bipolar inputs and targets. The truth table of the AND gate with bipolar inputs is given by

Input (x1, x2)    Target t
(1, 1)            1
(1, -1)           -1
(-1, 1)           -1
(-1, -1)          -1

What is the decision function (boundary) for this perceptron? Note that the initial value of the learning rate is set to 1, and the weights and bias are set to 0. The threshold parameter is also set to 0.

Answer:
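
A minimal sketch of the training loop, assuming the standard perceptron update rule (w <- w + a*t*x, b <- b + a*t on a misclassified sample); the course expects Matlab, so this Python version is only an illustration of the procedure:

```python
# Perceptron learning for the bipolar AND gate.
# Learning rate alpha = 1; weights, bias, and threshold start at 0,
# as specified in the question.
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0, 0]
b = 0
alpha = 1
theta = 0

def activate(net):
    # Bipolar step function with threshold theta
    if net > theta:
        return 1
    if net < -theta:
        return -1
    return 0

changed = True
while changed:
    changed = False
    for (x1, x2), t in samples:
        net = b + w[0] * x1 + w[1] * x2
        if activate(net) != t:       # update only on a wrong output
            w[0] += alpha * t * x1
            w[1] += alpha * t * x2
            b += alpha * t
            changed = True

print(w, b)  # [1, 1] -1
```

With these initial values the loop converges to w = (1, 1), b = -1, i.e. the decision boundary x1 + x2 - 1 = 0.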

Dr. Frank Shih

4.3 (Refer to Lecture 11) (25 points)

Consider the two training sets, each consisting of two patterns: class 1 contains [(1,0,0), (1,0,1)], and class 2 contains [(1,1,1), (1,1,0)]. Derive the decision function d(x) using the perceptron learning network with C = 1 and W(1) = 0. Note that for class 1, we let y(1) = [1,0,0,1]^T and y(2) = [1,0,1,1]^T; for class 2, we let y(3) = [1,1,1,1]^T and y(4) = [1,1,0,1]^T.

Answer:
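
The iteration can be sketched as follows, assuming the usual reward-punishment perceptron rule (add C*y when a class-1 pattern gives w.y <= 0, subtract C*y when a class-2 pattern gives w.y >= 0) and cycling through y(1)..y(4) until a full pass makes no change; again this is an illustrative Python sketch, not the required Matlab code:

```python
# Perceptron (reward-punishment) training with C = 1 and W(1) = 0
# on the augmented patterns y(1)..y(4) given in the question.
C = 1
class1 = [[1, 0, 0, 1], [1, 0, 1, 1]]   # y(1), y(2)
class2 = [[1, 1, 1, 1], [1, 1, 0, 1]]   # y(3), y(4)

w = [0, 0, 0, 0]                        # W(1) = 0

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

changed = True
while changed:
    changed = False
    for y in class1:                    # want w . y > 0
        if dot(w, y) <= 0:
            w = [wi + C * yi for wi, yi in zip(w, y)]
            changed = True
    for y in class2:                    # want w . y < 0
        if dot(w, y) >= 0:
            w = [wi - C * yi for wi, yi in zip(w, y)]
            changed = True

print(w)  # coefficients of the decision function d(x) = w . y
```

The final weight vector gives d(x) = w1*x1 + w2*x2 + w3*x3 + w4, positive for class 1 and negative for class 2.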

4.4 (Refer to Lecture 13) (25 points)

Probability is different from fuzzy sets. Let x = John, S be the set of smart people, and T be the set of tall people. Let the probability (or fuzzy membership) values be mS(x) = 0.9 and mT(x) = 0.9.

(a) What is the probability of John being tall and smart?

(b) What is the fuzzy membership of John being tall and smart? Note that the fuzzy AND operation is the minimum selection.

(c) If we interpret "0.9" as "very" and "a bit lower than 0.9" as "quite", then the semantics of the above probability combination yields "If John is very smart and John is very tall, then John is a quite tall, smart person." What is the semantics of the above fuzzy combination?

(d) In the example of fuzzy membership of Young in Lecture 13, calculate the membership when

someone reaches the age of 50. Note that when the age is 20, the membership is 1, and when the

age is 23, the membership is 0.9.

Answer: (a)(b)(c)(d)
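
The arithmetic behind (a), (b), and (d) can be sketched as below. Two assumptions are labeled in the code: the probabilistic AND assumes independence, and the Young membership is taken to fall off linearly after age 20 (which matches the two anchor points given: 1 at age 20 and 0.9 at age 23):

```python
# Probabilistic AND vs. fuzzy AND for mS(x) = mT(x) = 0.9.
p_smart = 0.9
p_tall = 0.9

prob_and = p_smart * p_tall       # (a) assuming independence: 0.81
fuzzy_and = min(p_smart, p_tall)  # (b) fuzzy AND = minimum: 0.9

def mu_young(age):
    # (d) Assumed linear membership: 1 at age 20, dropping 0.1 every
    # 3 years (so 0.9 at age 23), clipped at 0.
    if age <= 20:
        return 1.0
    return max(0.0, 1.0 - (age - 20) / 30.0)

print(prob_and, fuzzy_and, mu_young(50))
```

Under the linear assumption the membership reaches 0 at age 50, while the fuzzy AND of the two 0.9 memberships stays at 0.9, unlike the probabilistic product.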

