ML unit-2

Supervised learning is a type of machine learning where models are trained using labeled data to predict outcomes based on input variables. It involves various algorithms for classification and regression tasks, allowing the model to learn from past experiences and make predictions on new data. Key algorithms include linear regression, decision trees, and the Naive Bayes classifier, each suited for different types of predictive tasks.

UNIT-2: SUPERVISED LEARNING

Q. What is supervised learning? Explain how it works, the steps involved in it, and the types of supervised learning.

* Supervised learning is one of the types of ML in which machines are trained using well-labelled training data, and on the basis of that data the machine predicts the output. Labelled data means input data that is already tagged with the correct output.
* In supervised learning, the training data provided to the model works as the supervisor and teaches the model to predict the output correctly.
* Supervised learning is the process of providing input data as well as the correct output data to the ML model.
* The aim of a supervised learning algorithm is to find a mapping function that maps the input variable x to the output variable y.

How supervised learning works:
* In supervised learning, models are trained using a labelled dataset, where the model learns about each type of data.
* Once the training process is completed, the model is tested on test data and then predicts the output.
* Example: suppose we have a dataset of different types of shapes, including squares, rectangles, triangles and polygons. The first step is to train the model on each shape:
  - If the given shape has four sides and all the sides are equal, it is labelled as a square.
  - If the given shape has three sides, it is labelled as a triangle.
  - If the given shape has six equal sides, it is labelled as a hexagon.
* After training, we test the model using the test data, and the model has to identify the shape. Since the model has already been trained on all types of shapes, when it finds a new shape it classifies the shape on the basis of its number of sides and predicts the output.

Steps involved in supervised learning:
Step 1: Determine the type of training dataset.
Step 2: Collect the labelled training dataset.
Step 3: Split the data into a training dataset, a test dataset and a validation dataset.
Step 4: Determine the input features of the training dataset.
Step 5: Determine the suitable algorithm for the model, such as SVM, decision tree, etc.
Step 6: Execute the algorithm on the training dataset.
Step 7: Evaluate the accuracy of the model by providing the test dataset. If the model predicts the correct output, our model is accurate.

Types of supervised learning:
There are two types of supervised learning: Regression and Classification.

1. Regression:
* Regression algorithms are used when there is a relationship between the input variable and the output variable.
* Regression is used for the prediction of continuous variables, such as weather forecasting, market trends, etc.
* Some of the popular regression algorithms used in supervised learning are:
  i) Linear regression
  ii) Regression trees
  iii) Non-linear regression
  iv) Bayesian linear regression
  v) Polynomial regression

2. Classification:
* Classification algorithms are used when the output variable is categorical, which means there are two or more classes such as 'Yes' or 'No', 'Male' or 'Female', 'True' or 'False', 'Human' or 'Animal', 'Cat' or 'Dog', etc.
* Classification algorithms are used in spam detection, fraud detection, image classification, risk assessment, etc.
* Some of the popular classification algorithms used in supervised learning are listed below (a small code sketch follows the list):
  i) Random forest
  ii) Decision trees
  iii) Logistic regression
  iv) Support vector machine (SVM)
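To make the two branches concrete, here is a minimal sketch (assuming scikit-learn is installed; the tiny labelled datasets are made up for illustration) that fits one regression model and one classification model and predicts on new inputs:

```python
# Minimal sketch of supervised learning: labelled inputs (X) and outputs (y)
# are used to fit a model, which then predicts the output for unseen inputs.
from sklearn.linear_model import LinearRegression, LogisticRegression

# Regression: the output variable y is continuous (e.g. a price or temperature).
X_reg = [[1], [2], [3], [4]]
y_reg = [10.0, 20.0, 30.0, 40.0]
reg = LinearRegression().fit(X_reg, y_reg)
print(reg.predict([[5]]))        # -> [50.] (approximately)

# Classification: the output variable y is categorical (class 0 or class 1).
X_clf = [[0.1], [0.2], [0.8], [0.9]]
y_clf = [0, 0, 1, 1]
clf = LogisticRegression().fit(X_clf, y_clf)
print(clf.predict([[0.85]]))     # -> [1]
```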
Advantages of supervised learning:
* With the help of supervised learning, the model can predict the output on the basis of prior experience.
* In supervised learning we can have an exact idea about the classes of objects.
* Supervised learning models help us to solve various real-world problems such as fraud detection, spam filtering, etc.

Disadvantages of supervised learning:
* Supervised learning models are not suitable for handling complex tasks.
* Supervised learning cannot predict the correct output if the test data is different from the training dataset.
* Training requires a lot of computation time.

Q. Differentiate between classification and regression.

Classification:
* Classification means to group the output into classes; the output is labelled into 'A' or 'B', etc.
* Classification algorithms predict discrete values.
* A classification problem with two classes is called binary classification; a problem with more than two classes is called multi-class classification.
* In a classification model the target variable can take only a discrete set of values.
* It is a supervised learning task.
* To assess the model fit we calculate the percentage of correct classifications.
* Applications: email spam detection, image classification, fraud detection, etc.

Regression:
* Regression means to predict the output value using the training data.
* Regression algorithms predict continuous values.
* A regression problem requires the prediction of a quantity; a regression problem with multiple input variables is called a multivariable regression problem.
* In a regression model the target variable can take continuous values, i.e. typically a real number.
* It is a supervised learning task.
* To assess the model fit we calculate the root mean squared error.
* Applications: predicting weather reports, predicting house prices, etc.

Q. Explain the common distance-based methods in ML with examples.

Distance-based methods help to determine the similarities between data points, enabling algorithms to perform their tasks more accurately and effectively.

1. Euclidean distance:
* Euclidean distance is the most fundamental distance measurement; it is the straight-line distance between two data points.
* It is used to calculate the distance between two points in a 2-D (or higher-dimensional) space.
* The Euclidean distance formula for points (x1, y1) and (x2, y2) is

  d = sqrt( (x2 - x1)^2 + (y2 - y1)^2 )

2. Manhattan distance:
* It is named after the New York City borough of Manhattan, where the streets form a grid.
* Manhattan distance calculates the distance between two points as the sum of the absolute differences of their coordinates; it is useful for ordinal and grid-like data.
* The formula for Manhattan distance is

  d(x, y) = sum_i |x_i - y_i|

3. Minkowski distance:
* Minkowski distance generalizes the Euclidean and Manhattan distances; it is calculated on the basis of an order parameter p (p = 1 gives Manhattan, p = 2 gives Euclidean).
* The formula for Minkowski distance is

  d(x, y) = ( sum_i |x_i - y_i|^p )^(1/p)

4. Cosine similarity:
* Cosine similarity measures the similarity between two vectors based on the cosine of the angle between them; it is particularly used in tasks like text similarity.
* The cosine similarity of two vectors A and B is

  sim(A, B) = cos(theta) = (A . B) / (|A| |B|)

5. Hamming distance:
* Hamming distance is used with binary or categorical data; it calculates the distance by counting the number of positions at which two strings of equal length differ.
* It plays an important role in data transmission and error-correcting codes.
* Example: the Hamming distance between 10110 and 11100 is 2.

A small sketch of these distance measures in code follows below.
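A minimal sketch of the five distance/similarity measures above, written in plain Python (the sample points are made up for illustration):

```python
import math

# Two made-up points/vectors.
a = [1.0, 2.0, 3.0]
b = [4.0, 6.0, 3.0]

# Euclidean: straight-line distance.
euclidean = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))       # sqrt(9 + 16 + 0) = 5.0

# Manhattan: sum of absolute coordinate differences.
manhattan = sum(abs(x - y) for x, y in zip(a, b))                    # 3 + 4 + 0 = 7.0

# Minkowski: generalisation with order parameter p (p=1 Manhattan, p=2 Euclidean).
p = 3
minkowski = sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

# Cosine similarity: cosine of the angle between the two vectors.
dot = sum(x * y for x, y in zip(a, b))
cosine = dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Hamming: number of positions where two equal-length strings differ.
hamming = sum(c1 != c2 for c1, c2 in zip("10110", "11100"))          # 2

print(euclidean, manhattan, minkowski, round(cosine, 3), hamming)
```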
Q. What is the K-NN (K-nearest neighbours) algorithm? Explain it with an example.

* K-nearest neighbours (K-NN) is based on supervised learning.
* K-NN can be used for regression as well as classification, but it is mostly used for classification problems.
* The K-NN algorithm stores all the available data and classifies a new data point based on its similarity to the stored data. This means that whenever new data arrives, it can be classified into a suitable category using the K-NN algorithm.
* K-NN is a non-parametric algorithm, which means it doesn't make any assumptions about the underlying data.
* It is also called a lazy learner algorithm because it doesn't learn from the training set immediately; instead it stores the dataset, and only when a new point arrives does it classify that data.

Example: suppose we have training images of circles (category A) and triangles (category B). When a new data point (image) arrives, the K-NN algorithm checks its similarities to the training data. If the new point has more similarities to the circles of category A, it is categorized into category A.

Why do we need the K-NN algorithm?
Suppose there are two categories, A and B, and we have a new data point x. To find which category this point belongs to, we use the K-NN algorithm.

How does the K-NN algorithm work?
Step 1: Select the number K of neighbours.
Step 2: Calculate the Euclidean distance from the new point to each training point.
Step 3: Take the K nearest neighbours as per the Euclidean distance.
Step 4: Among these K neighbours, count the number of data points in each category.
Step 5: Assign the new data point to the category for which the number of neighbours is maximum.
Step 6: The model is ready; after applying K-NN, the new data point is categorized into the majority category among its neighbours.

Euclidean distance between two points (x1, y1) and (x2, y2):

  d = sqrt( (x2 - x1)^2 + (y2 - y1)^2 )

How do we select the value of K in the K-NN algorithm?
* There is no particular rule or formula to determine the best value of K; we need to try different values to find the best one.
* The value of K should preferably be an odd number; if it is even, there is a chance of an equal number of neighbours from each category, in which case the new data element cannot be assigned to one particular class.

Advantages of K-NN:
* It is simple to implement.
* It is robust to noisy training data.
* It can be more effective if the training data is large.

Disadvantages of K-NN:
* It is sometimes difficult to determine the value of K.
* The computation cost is high because we have to calculate the distance between the new point and all the data points in the training data, which is very expensive in time.

A small from-scratch sketch of the K-NN steps above follows below.
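This sketch follows the six steps listed above; the 2-D points and labels are made up for illustration:

```python
import math
from collections import Counter

def knn_predict(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Steps 2-3: Euclidean distance to every training point, keep the k nearest.
    nearest = sorted(
        (math.dist(point, query), label)
        for point, label in zip(train_points, train_labels)
    )[:k]
    # Steps 4-5: count labels among the k neighbours, assign the majority class.
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Category A points cluster near the origin, category B points further away.
points = [(1, 1), (1, 2), (2, 1), (6, 6), (7, 6), (6, 7)]
labels = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(points, labels, query=(2, 2), k=3))   # -> 'A'
```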
Q. What is the Naive Bayes algorithm in ML? Explain with an example.

* Naive Bayes is a classification technique. It is a supervised learning algorithm based on Bayes' theorem, used for solving classification problems.
* It is mainly used in text classification that involves high-dimensional training data.
* Naive Bayes is a simple and effective classification algorithm which helps in designing fast ML models that make quick predictions.
* It is a probabilistic classifier: predictions are made on the basis of the probability of an object belonging to a class.
* Some popular examples where the Naive Bayes algorithm is used are spam filtering, text classification and sentiment analysis.

Why is it called Naive Bayes?
* It is called Naive because it assumes that the occurrence of a certain feature is independent of the occurrence of the other features. For example, if a fruit is identified on the basis of colour, shape and taste, then a red, spherical, sweet fruit is recognised as an apple: each feature individually contributes to identifying it as an apple, without depending on the other features.
* It is called Bayes because the algorithm is based on the principle of Bayes' theorem.

Bayes' theorem:
Bayes' theorem, also known as Bayes' rule or Bayes' law, is used to determine the probability of a hypothesis given prior knowledge; it depends on conditional probability. The formula for Bayes' theorem is

  P(A|B) = P(B|A) * P(A) / P(B)

where
  P(A|B) - posterior probability: the probability of hypothesis A given the observed event B
  P(B|A) - likelihood: the probability of the evidence B given that the hypothesis A is true
  P(A)   - prior probability: the probability of the hypothesis before observing the evidence
  P(B)   - marginal probability: the probability of the evidence

Working of the Naive Bayes classifier:
Problem: we have a dataset of weather conditions and a corresponding target variable "Play". Using this dataset we need to decide whether we should play or not on a particular day according to the weather condition. To do this we follow the steps below:
Step 1: Convert the given dataset into frequency tables.
Step 2: Generate a likelihood table by finding the probabilities of the given features.
Step 3: Use Bayes' theorem to calculate the posterior probability.

Dataset (Outlook, Play): Rainy/Yes, Sunny/Yes, Overcast/Yes, Overcast/Yes, Sunny/No, Rainy/Yes, Sunny/Yes, Overcast/Yes, Rainy/No, Sunny/No, Sunny/Yes, Rainy/No, Overcast/Yes, Overcast/Yes.

Frequency table for the weather conditions:
  Outlook    Yes  No
  Overcast    5    0
  Rainy       2    2
  Sunny       3    2
  Total      10    4

Likelihood table for the weather conditions:
  Outlook    No    Yes
  Overcast  0/4   5/10   P(Overcast) = 5/14 = 0.35
  Rainy     2/4   2/10   P(Rainy)    = 4/14 = 0.29
  Sunny     2/4   3/10   P(Sunny)    = 5/14 = 0.35

Applying Bayes' theorem:
  P(Yes|Sunny) = P(Sunny|Yes) * P(Yes) / P(Sunny)
  P(Sunny|Yes) = 3/10 = 0.30, P(Yes) = 10/14 = 0.71, P(Sunny) = 5/14 = 0.35
  P(Yes|Sunny) = (0.30 * 0.71) / 0.35 = 0.60

  P(No|Sunny) = P(Sunny|No) * P(No) / P(Sunny)
  P(Sunny|No) = 2/4 = 0.50, P(No) = 4/14 = 0.29, P(Sunny) = 0.35
  P(No|Sunny) = (0.50 * 0.29) / 0.35 = 0.41

Hence P(Yes|Sunny) > P(No|Sunny), so on a sunny day the player can play the game.

Advantages:
* Naive Bayes is one of the fastest and easiest ML algorithms for predicting the class of a dataset.
* It can be used for binary as well as multi-class classification.
* It is a popular choice for text classification problems.
* It performs well in multi-class classification problems compared to other algorithms.

Disadvantages:
* Naive Bayes assumes that all features are independent or unrelated, so it cannot learn the relationships between features.

Applications of the Naive Bayes classifier:
* It is used for credit scoring.
* It is used in medical data classification.
* It is used in real-time predictions.
* It is used in text classification, spam filtering and sentiment analysis.

There are three types of Naive Bayes models:
1. Gaussian model
2. Multinomial model
3. Bernoulli model

The worked calculation above is reproduced in the short sketch below.
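A few lines of arithmetic checking the hand calculation; the counts come from the frequency table above (14 days, 10 Yes, 4 No, with Sunny appearing 3 times under Yes and 2 times under No):

```python
# Bayes' theorem applied to the frequency-table counts above.
p_yes, p_no = 10 / 14, 4 / 14            # prior probabilities
p_sunny = 5 / 14                         # marginal probability of Sunny
p_sunny_given_yes = 3 / 10               # likelihoods
p_sunny_given_no = 2 / 4

p_yes_given_sunny = p_sunny_given_yes * p_yes / p_sunny   # = 0.60
p_no_given_sunny = p_sunny_given_no * p_no / p_sunny      # = 0.40 (0.41 above, due to rounding)
print(round(p_yes_given_sunny, 2), round(p_no_given_sunny, 2))
# P(Yes|Sunny) > P(No|Sunny), so the prediction on a sunny day is "play".
```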
Q. Define decision tree and explain how one is built. Define the following terms: (i) Entropy (ii) Information Gain (iii) Gini Index.

(i) Entropy:
* In ML, entropy measures the level of impurity, disorder or uncertainty in a given dataset.
* It is a metric that quantifies (estimates) the amount of information in a dataset, and it is commonly used to evaluate the quality of a model and its ability to make accurate predictions.
* High entropy indicates high impurity; low entropy indicates low impurity and hence a more confident prediction.
* The mathematical formula for entropy (for a two-class set S) is

  E(S) = - P(Yes) log2 P(Yes) - P(No) log2 P(No)

  where P(Yes) and P(No) are the proportions of positive and negative outcomes in S.
* Example: a set with 3 Yes and 3 No outcomes has E = 1 (maximum impurity); a set with 4 Yes and 1 No has E = -(4/5) log2(4/5) - (1/5) log2(1/5) = 0.72; a pure set (all Yes or all No) has E = 0.

(ii) Information Gain (IG):
* Information gain is a measure of how much information is gained by splitting a set of data on a particular feature.
* It is calculated by comparing the entropy of the original (parent) dataset to the weighted entropy of the child sets created by the split.
* A higher information gain value indicates that the feature is more effective for splitting, so it is selected earlier in the tree.
* The formula for information gain is

  IG(S, a) = E(S) - E(S | a)

  where IG(S, a) is the information gain for the dataset S when split on the feature a, E(S) is the entropy of the dataset, and E(S | a) is the weighted entropy of the child sets produced by the split. (A code sketch of both measures appears just before the ID3 example below.)

(iii) Gini index:
* The Gini index measures impurity as Gini = 1 - sum_j (P_j)^2; it is defined and used in the CART algorithm section below.

Decision tree:
* A decision tree is a supervised learning algorithm used for classification as well as regression problems; however, it is mostly used for classification problems.
* Its structure is similar to a tree, where each internal node represents a feature of the dataset, the branches represent the decision rules, and each leaf node represents an outcome.
* Decision trees are used to predict an outcome based on historical data.
* The decision tree works as a sequence of if-then-else statements applied to an initial problem.

Decision tree terminology:
* Leaf node: a leaf node is the output of a decision node; it does not contain any branches, which means the tree cannot be split further at that point.
* Root node: the root node is the origin point of any decision tree; it contains the entire dataset, which then gets divided into two or more sub-nodes.
* Decision node: a node that includes multiple branches and is used to make a decision in classification problems.
* Splitting: the process that divides the root node (or a decision node) into multiple sub-nodes based on some defined conditions.
* Branches: branches are formed by splitting the root node or a decision node.
* Pruning: pruning is defined as the process of removing unwanted branches from the tree.
* Child node: except for the root node, all other nodes in the decision tree are called child nodes.
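A small sketch of the two measures defined above, entropy and information gain, in plain Python; these are the quantities the ID3 example below computes by hand:

```python
import math
from collections import Counter

def entropy(labels):
    """E(S) = -sum(p_i * log2(p_i)) over the class proportions in `labels`."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, feature_values):
    """IG(S, a) = E(S) - weighted average entropy of the subsets created by feature a."""
    n = len(labels)
    subsets = {}
    for value, label in zip(feature_values, labels):
        subsets.setdefault(value, []).append(label)
    child_entropy = sum(len(s) / n * entropy(s) for s in subsets.values())
    return entropy(labels) - child_entropy

# 9 Yes / 5 No labels give E(S) = 0.94, the parent entropy used in the ID3 example below.
labels = ["Yes"] * 9 + ["No"] * 5
print(round(entropy(labels), 2))   # -> 0.94
```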
Building a decision tree using the ID3 algorithm:
Let us build a decision tree for the dataset below. ID3 stands for "Iterative Dichotomiser 3".

  S.No  Outlook    Temperature  Humidity  Wind    Play
  1     Sunny      Hot          High      Weak    No
  2     Sunny      Hot          High      Strong  No
  3     Overcast   Hot          High      Weak    Yes
  4     Rainy      Mild         High      Weak    Yes
  5     Rainy      Cool         Normal    Weak    Yes
  6     Rainy      Cool         Normal    Strong  No
  7     Overcast   Cool         Normal    Strong  Yes
  8     Sunny      Mild         High      Weak    No
  9     Sunny      Cool         Normal    Weak    Yes
  10    Rainy      Mild         Normal    Weak    Yes
  11    Sunny      Mild         Normal    Strong  Yes
  12    Overcast   Mild         High      Strong  Yes
  13    Overcast   Hot          Normal    Weak    Yes
  14    Rainy      Mild         High      Strong  No

The dataset has 9 Yes and 5 No outcomes, so the entropy of the parent set is

  E(P) = -(9/14) log2(9/14) - (5/14) log2(5/14) = 0.41 + 0.53 = 0.94

Step 1: Consider the feature Outlook.

  Outlook    Yes  No  Total
  Sunny       2    3    5
  Overcast    4    0    4
  Rainy       3    2    5

  E(Sunny)    = -(2/5) log2(2/5) - (3/5) log2(3/5) = 0.53 + 0.44 = 0.97
  E(Overcast) = -(4/4) log2(4/4) = 0
  E(Rainy)    = -(3/5) log2(3/5) - (2/5) log2(2/5) = 0.97
  E(P|Outlook) = (5/14) x 0.97 + (4/14) x 0 + (5/14) x 0.97 = 0.69

  Information gain: IG(Outlook) = |E(P) - E(P|Outlook)| = |0.94 - 0.69| = 0.25

Step 2: Consider the feature Temperature.

  Temperature  Yes  No  Total
  Hot           2    2    4
  Mild          4    2    6
  Cool          3    1    4

  E(Hot) = 1.00, E(Mild) = 0.92, E(Cool) = 0.81
  E(P|Temperature) = (4/14) x 1.00 + (6/14) x 0.92 + (4/14) x 0.81 = 0.91

  Information gain: IG(Temperature) = |0.94 - 0.91| = 0.03

Step 3: Consider the feature Humidity.

  Humidity  Yes  No  Total
  High       3    4    7
  Normal     6    1    7

  E(High) = 0.99, E(Normal) = 0.59
  E(P|Humidity) = (7/14) x 0.99 + (7/14) x 0.59 = 0.79

  Information gain: IG(Humidity) = |0.94 - 0.79| = 0.15

Step 4: Consider the feature Wind.

  Wind    Yes  No  Total
  Weak     6    2    8
  Strong   3    3    6

  E(Weak) = 0.81, E(Strong) = 1.00
  E(P|Wind) = (8/14) x 0.81 + (6/14) x 1.00 = 0.89

  Information gain: IG(Wind) = |0.94 - 0.89| = 0.05

Step 5: Outlook has the maximum information gain (IG) when compared to the other features, so it is selected as the parent (root) node of the tree. Regrouping the dataset by the value of Outlook gives the Sunny, Overcast and Rainy subsets; all four Overcast rows have Play = Yes, so the Overcast branch becomes a Yes leaf, and the diagram so far looks like:

  Outlook
    Sunny    -> ? (decided below)
    Overcast -> Yes
    Rainy    -> ? (decided below)

The root-node selection above can be checked with the short script below.
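A minimal sketch (restating the 14-row table above as tuples) that recomputes the information gain of each feature:

```python
import math
from collections import Counter

def H(labels):
    """Entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# (Outlook, Temperature, Humidity, Wind, Play) for the 14 days in the table above.
data = [
    ("Sunny", "Hot", "High", "Weak", "No"),      ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"),  ("Rainy", "Mild", "High", "Weak", "Yes"),
    ("Rainy", "Cool", "Normal", "Weak", "Yes"),  ("Rainy", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"), ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),  ("Rainy", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"), ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"), ("Rainy", "Mild", "High", "Strong", "No"),
]
play = [row[-1] for row in data]

for i, name in enumerate(["Outlook", "Temperature", "Humidity", "Wind"]):
    subsets = {}
    for row in data:
        subsets.setdefault(row[i], []).append(row[-1])
    child = sum(len(s) / len(data) * H(s) for s in subsets.values())
    print(name, round(H(play) - child, 3))
# -> Outlook 0.247, Temperature 0.029, Humidity 0.152, Wind 0.048: Outlook becomes the root.
```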
Sunny branch (5 rows: 2 Yes, 3 No), E(Sunny) = 0.97:

  Temperature under Sunny: Hot (0 Yes, 2 No), Mild (1 Yes, 1 No), Cool (1 Yes, 0 No)
    E(Hot) = 0, E(Mild) = 1, E(Cool) = 0
    E(Sunny|Temperature) = (2/5) x 0 + (2/5) x 1 + (1/5) x 0 = 0.40
    IG(Temperature) = 0.97 - 0.40 = 0.57

  Humidity under Sunny: High (0 Yes, 3 No), Normal (2 Yes, 0 No)
    E(High) = 0, E(Normal) = 0
    E(Sunny|Humidity) = 0, so IG(Humidity) = 0.97 - 0 = 0.97

  Wind under Sunny: Weak (1 Yes, 2 No), Strong (1 Yes, 1 No)
    E(Weak) = 0.92, E(Strong) = 1
    E(Sunny|Wind) = (3/5) x 0.92 + (2/5) x 1 = 0.95
    IG(Wind) = 0.97 - 0.95 = 0.02

  Humidity has the highest information gain, so it becomes the decision node under the Sunny branch (High -> No, Normal -> Yes).

Rainy branch (5 rows: 3 Yes, 2 No), E(Rainy) = 0.97:

  Temperature under Rainy: Mild (2 Yes, 1 No), Cool (1 Yes, 1 No)
    E(Rainy|Temperature) = (3/5) x 0.92 + (2/5) x 1 = 0.95, so IG = 0.02
  Humidity under Rainy: High (1 Yes, 1 No), Normal (2 Yes, 1 No)
    E(Rainy|Humidity) = (2/5) x 1 + (3/5) x 0.92 = 0.95, so IG = 0.02
  Wind under Rainy: Weak (3 Yes, 0 No), Strong (0 Yes, 2 No)
    E(Rainy|Wind) = 0, so IG = 0.97

  Wind has the highest information gain, so it becomes the decision node under the Rainy branch (Weak -> Yes, Strong -> No).

The final decision tree built using the ID3 algorithm:

  Outlook
    Sunny    -> Humidity
                  High   -> No
                  Normal -> Yes
    Overcast -> Yes
    Rainy    -> Wind
                  Weak   -> Yes
                  Strong -> No

Q. Explain the CART algorithm for building a decision tree.

* CART stands for Classification And Regression Trees.
* The CART algorithm builds a decision tree by learning simple decision rules from the data; for classification tasks it selects splits using the Gini index.
* The Gini index is a metric for classification tasks in the CART algorithm. It uses the sum of the squared probabilities of each class and is given by

  Gini = 1 - sum_j (P_j)^2

  where P_j is the proportion of class j in the node.

Consider the same 14-day (Outlook, Temperature, Humidity, Wind, Play) dataset used for ID3 above.

Step 1: Gini index for Outlook.

  Outlook    Yes  No  Total
  Sunny       2    3    5
  Overcast    4    0    4
  Rainy       3    2    5

  Gini(Outlook = Sunny)    = 1 - (2/5)^2 - (3/5)^2 = 0.48
  Gini(Outlook = Overcast) = 1 - (4/4)^2 - (0/4)^2 = 0
  Gini(Outlook = Rainy)    = 1 - (3/5)^2 - (2/5)^2 = 0.48
  Weighted Gini(Outlook) = (5/14) x 0.48 + (4/14) x 0 + (5/14) x 0.48 = 0.34

Step 2: Gini index for Temperature: Hot (2 Yes, 2 No), Mild (4 Yes, 2 No), Cool (3 Yes, 1 No).

  Gini(Hot) = 0.50, Gini(Mild) = 0.44, Gini(Cool) = 0.38
  Weighted Gini(Temperature) = (4/14) x 0.50 + (6/14) x 0.44 + (4/14) x 0.38 = 0.44

Step 3: Gini index for Humidity: High (3 Yes, 4 No), Normal (6 Yes, 1 No).

  Gini(High) = 0.49, Gini(Normal) = 0.24
  Weighted Gini(Humidity) = (7/14) x 0.49 + (7/14) x 0.24 = 0.37

Step 4: Gini index for Wind: Weak (6 Yes, 2 No), Strong (3 Yes, 3 No).

  Gini(Weak) = 0.38, Gini(Strong) = 0.50
  Weighted Gini(Wind) = (8/14) x 0.38 + (6/14) x 0.50 = 0.43

Summary: Gini(Outlook) = 0.34, Gini(Temperature) = 0.44, Gini(Humidity) = 0.37, Gini(Wind) = 0.43.
The root of the CART decision tree is selected on the basis of the minimum Gini index value; the parent node must have the lowest Gini index. Since Outlook has the lowest value, it is chosen as the root node. (The root-level Gini values can be checked with the sketch below.)
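The root-level Gini calculation for the Outlook feature can be reproduced with a few lines (the Yes/No counts per outlook value come from the table above):

```python
from collections import Counter

def gini(labels):
    """Gini index: 1 - sum of squared class probabilities."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

# Class labels per Outlook value, taken from the 14-row table above.
sunny    = ["Yes"] * 2 + ["No"] * 3
overcast = ["Yes"] * 4
rainy    = ["Yes"] * 3 + ["No"] * 2

total = len(sunny) + len(overcast) + len(rainy)   # 14 rows
weighted = sum(len(g) / total * gini(g) for g in (sunny, overcast, rainy))
print(round(gini(sunny), 2), round(gini(overcast), 2), round(weighted, 2))
# -> 0.48 0.0 0.34, matching the hand calculation for Gini(Outlook).
```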
Step 5: Gini index for the features of the Sunny branch (5 rows: 2 Yes, 3 No).

  Temperature under Sunny: Hot (0 Yes, 2 No), Cool (1 Yes, 0 No), Mild (1 Yes, 1 No)
    Gini(Sunny & Temp = Hot)  = 1 - (0/2)^2 - (2/2)^2 = 0
    Gini(Sunny & Temp = Cool) = 1 - (1/1)^2 - (0/1)^2 = 0
    Gini(Sunny & Temp = Mild) = 1 - (1/2)^2 - (1/2)^2 = 0.50
    Weighted Gini(Sunny & Temperature) = (2/5) x 0 + (1/5) x 0 + (2/5) x 0.50 = 0.20

  Humidity under Sunny: High (0 Yes, 3 No), Normal (2 Yes, 0 No)
    Gini(Sunny & Humidity = High)   = 1 - (0/3)^2 - (3/3)^2 = 0
    Gini(Sunny & Humidity = Normal) = 1 - (2/2)^2 - (0/2)^2 = 0
    Weighted Gini(Sunny & Humidity) = (3/5) x 0 + (2/5) x 0 = 0

  Wind under Sunny: Weak (1 Yes, 2 No), Strong (1 Yes, 1 No)
    Gini(Sunny & Wind = Weak)   = 1 - (1/3)^2 - (2/3)^2 = 0.44
    Gini(Sunny & Wind = Strong) = 1 - (1/2)^2 - (1/2)^2 = 0.50
    Weighted Gini(Sunny & Wind) = (3/5) x 0.44 + (2/5) x 0.50 = 0.47

  Humidity has the minimum weighted Gini index (0), so it becomes the decision node under the Sunny branch (High -> No, Normal -> Yes).

Step 6: Repeating the same calculation for the Rainy branch (5 rows: 3 Yes, 2 No):

  Weighted Gini(Rainy & Temperature) = (3/5) x 0.44 + (2/5) x 0.50 = 0.47
  Weighted Gini(Rainy & Humidity)    = (2/5) x 0.50 + (3/5) x 0.44 = 0.47
  Weighted Gini(Rainy & Wind)        = (3/5) x 0 + (2/5) x 0 = 0

  Wind has the minimum weighted Gini index for the Rainy branch, so it becomes the decision node there (Weak -> Yes, Strong -> No), and the final CART tree is the same as the tree produced by the ID3 algorithm above.

The Sunny-branch selection can be verified with the sketch below.
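This sketch recomputes the weighted Gini of each remaining feature on the five Sunny rows restated from the table:

```python
from collections import Counter

def gini(labels):
    """Gini index: 1 - sum of squared class probabilities."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

# The five Sunny rows from the table, as (Temperature, Humidity, Wind, Play).
sunny_rows = [
    ("Hot", "High", "Weak", "No"),   ("Hot", "High", "Strong", "No"),
    ("Mild", "High", "Weak", "No"),  ("Cool", "Normal", "Weak", "Yes"),
    ("Mild", "Normal", "Strong", "Yes"),
]
for i, name in enumerate(["Temperature", "Humidity", "Wind"]):
    subsets = {}
    for row in sunny_rows:
        subsets.setdefault(row[i], []).append(row[-1])
    weighted = sum(len(s) / len(sunny_rows) * gini(s) for s in subsets.values())
    print(name, round(weighted, 2))
# -> Temperature 0.2, Humidity 0.0, Wind 0.47: Humidity gives the purest split under Sunny.
```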
