Comments:
ok i pull up
good job Abdul, you are a good teacher; best wishes for your journey ahead in this field
wow
What if minimum support is not given as a percentage?
Great explanation ♥
If l1 ∧ l2 --> l3, then in the confidence rule is the denominator the occurrence of l1 or the occurrence of l2?
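A minimal sketch answering this (my own toy transactions, not from the video, assuming the standard definitions): for l1 ∧ l2 --> l3 the denominator is the support of {l1, l2} occurring together, not of l1 or l2 alone.

```python
# Hypothetical toy transactions (not from the video), for illustration only.
transactions = [
    {"l1", "l2", "l3"},
    {"l1", "l2"},
    {"l1", "l3"},
    {"l2", "l3"},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# confidence(l1 ^ l2 -> l3) = support({l1, l2, l3}) / support({l1, l2}):
# the denominator is the JOINT occurrence of l1 and l2, not either one alone.
conf = support({"l1", "l2", "l3"}) / support({"l1", "l2"})
print(conf)  # 0.25 / 0.5 = 0.5
```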
Thanks bro
Love you brother, thanks a lot 🙏🙏🙏🙏🙏
Well explained sir! Thank you
A question: if we only want, or are only able, to do 3 iterations, does that violate the Apriori concept itself?
You are a god, you helped me in my exam
thank you so much sir
tomorrow is my paper and this video really helped me
Whenever we have A tends to B and B tends to C, how can I calculate the support? Please tell me
Thanks for making a clear explanation !!
Thanks for the good explanation, the whole concept is clear to me now
It is really very helpful to me sir, thank you very much; even my teacher can't explain it like this sir, thank you
Thank you sir, studying just a night before the exam and your video is very helpful❤️
Why didn't we do C -> D = A?
Could you please make videos on Sequential Pattern Mining? You break it down beautifully and make it easy to follow. You are doing a great job
When do we have to compute C3?
If the confidence does not come out according to the question, what should we write as the final rule?
A database has four transactions. Let min_sup = 60% and min_conf = 80%.

TID | Date       | Items Bought
100 | 10/15/2018 | {K, A, B, D}
200 | 10/15/2018 | {D, A, C, E, B}
300 | 10/19/2018 | {C, A, B, E}
400 | 10/22/2018 | {B, A, D}

1) Find all frequent itemsets using Apriori and FP-growth, respectively. Compare the efficiency of the two mining processes.
2) List all of the strong association rules (with support s and confidence c) matching the following meta-rule, where X is a variable representing customers and item_i denotes variables representing items (e.g., "A", "B", etc.): ∀X ∈ transactions, buys(X, item1) ∧ buys(X, item2) => buys(X, item3) [s, c]
Please explain this, sir?
Thank you
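A hedged sketch of this exercise in Python (my own brute-force code, assuming the standard support/confidence definitions; the FP-growth part is not shown). With min_sup = 60% of 4 transactions (support count ≥ 3), the frequent itemsets come out as A, B, D, AB, AD, BD, ABD, and of the meta-rule candidates only A ∧ D => B and B ∧ D => A meet min_conf = 80%:

```python
from itertools import combinations

# Transactions from the question above.
transactions = [
    {"K", "A", "B", "D"},
    {"D", "A", "C", "E", "B"},
    {"C", "A", "B", "E"},
    {"B", "A", "D"},
]
min_sup, min_conf = 0.6, 0.8
n = len(transactions)

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / n

# Brute-force frequent-itemset search (fine for 4 transactions; real
# Apriori would generate and prune candidates level by level instead).
items = sorted(set().union(*transactions))
frequent = [frozenset(c)
            for k in range(1, len(items) + 1)
            for c in combinations(items, k)
            if support(frozenset(c)) >= min_sup]

# Strong rules of the meta-rule form buys(X,i1) ^ buys(X,i2) => buys(X,i3):
# split each frequent 3-itemset into a 2-item antecedent and 1-item consequent.
rules = [(tuple(sorted(fs - {rhs})), rhs, support(fs), support(fs) / support(fs - {rhs}))
         for fs in frequent if len(fs) == 3
         for rhs in fs
         if support(fs) / support(fs - {rhs}) >= min_conf]

print(len(frequent), len(rules))  # 7 frequent itemsets, 2 strong rules
for lhs, rhs, s, c in sorted(rules):
    print(f"{lhs} -> {rhs} [s={s:.0%}, c={c:.0%}]")
```

Note that A ∧ B => D is excluded: its confidence is 3/4 = 75%, below min_conf.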
Bro, that L1 is what we call the large itemset..
well explained
A ∧ C --> D, how do we calculate that?
Wow Abdul bhaiya, look how far you have come. Hats off, bro
If my frequent itemset is {1, 2, 3, 4}, then how will the association rules be created?
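A small sketch of the standard rule-generation step (my own code, not anything specific from the video): a frequent k-itemset yields 2^k − 2 candidate rules, one per nonempty proper subset used as the antecedent, so {1, 2, 3, 4} gives 14 candidates before confidence filtering.

```python
from itertools import combinations

# Hypothetical frequent itemset from the question, for illustration.
freq = frozenset({1, 2, 3, 4})

# Every nonempty proper subset L of a frequent k-itemset yields one
# candidate rule L -> (freq - L), giving 2**k - 2 candidates in total;
# each candidate is then kept only if its confidence meets min_conf.
rules = [(set(lhs), set(freq - set(lhs)))
         for r in range(1, len(freq))
         for lhs in combinations(sorted(freq), r)]
print(len(rules))  # 2**4 - 2 = 14 candidate rules
```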
How can we get 2? As far as I know, the answer is 8.
thanks, very helpful on the Apriori algorithm
love from Kerala, KTU student
Thanks! Clearly explained! This is what I was in search of!
Awesome explanation
Thank you sir 🙏
Best teacher 👍🏻
MashaAllah, good brother
Great, may Allah increase your knowledge, Ameen
Hi, it would have been good if you had outlined why we calculate support and confidence and what their use is in the Apriori algorithm
Sir, if it was A ∧ C -> D, what would the confidence be then? Please tell, I have an exam on Friday
And if both answers are below the minimum confidence, then what will the final answer be?
Bro, if we get A and C tends to D, how do we find the confidence? I mean, when 3 items come up
Try to explain in English, please...
Thank you for this video bhaiya, it's really helpful for my exam...
Perfect explanation 🥰😌
Thanks my brother....👏👏
Good work bro, you explained it well; if you know any programming language, then make and upload a video on it
explaining it like a pro..
thanks sir
nicely explained