Data Mining Association Analysis: Basic Concepts and Algorithms


Lecture Notes for Chapter 6 of Introduction to Data Mining, by Tan, Steinbach, Kumar

Association Rule Mining
- Given a set of transactions, find rules that will predict the occurrence of an item based on the occurrences of other items in the transaction.
- Market-basket transactions (the running example):

  TID  Items
  1    Bread, Milk
  2    Bread, Diaper, Beer, Eggs
  3    Milk, Diaper, Beer, Coke
  4    Bread, Milk, Diaper, Beer
  5    Bread, Milk, Diaper, Coke

- Example of association rules:
  {Diaper} → {Beer}
  {Milk, Bread} → {Eggs, Coke}
  {Beer, Bread} → {Milk}
- Implication means co-occurrence, not causality!

Definition: Frequent Itemset
- Itemset: a collection of one or more items, e.g. {Milk, Bread, Diaper}
- k-itemset: an itemset that contains k items
- Support count (σ): frequency of occurrence of an itemset, e.g. σ({Milk, Bread, Diaper}) = 2
- Support (s): fraction of transactions that contain an itemset, e.g. s({Milk, Bread, Diaper}) = 2/5
- Frequent itemset: an itemset whose support is greater than or equal to a minsup threshold

Definition: Association Rule
- Association rule: an implication expression of the form X → Y, where X and Y are itemsets, e.g. {Milk, Diaper} → {Beer}
- Rule evaluation metrics, over N transactions:
  - Support (s): fraction of transactions that contain both X and Y, s(X → Y) = σ(X ∪ Y) / N
  - Confidence (c): how often items in Y appear in transactions that contain X, c(X → Y) = σ(X ∪ Y) / σ(X)
- Example: for {Milk, Diaper} → {Beer}, s = σ({Milk, Diaper, Beer}) / 5 = 2/5 = 0.4 and c = σ({Milk, Diaper, Beer}) / σ({Milk, Diaper}) = 2/3 ≈ 0.67
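Both metrics reduce to counting transactions that contain a given itemset. The following is a minimal Python sketch (mine, not from the slides; the transaction list restates the running example and the function names are illustrative) that reproduces the numbers above:

```python
transactions = [
    {"Bread", "Milk"},
    {"Bread", "Diaper", "Beer", "Eggs"},
    {"Milk", "Diaper", "Beer", "Coke"},
    {"Bread", "Milk", "Diaper", "Beer"},
    {"Bread", "Milk", "Diaper", "Coke"},
]

def support_count(itemset, transactions):
    """sigma(X): number of transactions containing every item in X."""
    return sum(1 for t in transactions if itemset <= t)

def support(itemset, transactions):
    """s(X) = sigma(X) / N."""
    return support_count(itemset, transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """c(X -> Y) = sigma(X u Y) / sigma(X)."""
    return (support_count(antecedent | consequent, transactions)
            / support_count(antecedent, transactions))

# {Milk, Diaper} -> {Beer}: s = 2/5 = 0.4, c = 2/3 ~ 0.67
X, Y = {"Milk", "Diaper"}, {"Beer"}
print(support(X | Y, transactions))    # 0.4
print(confidence(X, Y, transactions))  # 0.666...
```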

Association Rule Mining Task
- Given a set of transactions T, the goal of association rule mining is to find all rules having
  - support ≥ minsup threshold
  - confidence ≥ minconf threshold
- Brute-force approach:
  - List all possible association rules
  - Compute the support and confidence for each rule
  - Prune rules that fail the minsup and minconf thresholds
  - Computationally prohibitive!

Mining Association Rules
- Example rules from the itemset {Milk, Diaper, Beer}:
  {Milk, Diaper} → {Beer} (s = 0.4, c = 0.67)
  {Milk, Beer} → {Diaper} (s = 0.4, c = 1.0)
  {Diaper, Beer} → {Milk} (s = 0.4, c = 0.67)
  {Beer} → {Milk, Diaper} (s = 0.4, c = 0.67)
  {Diaper} → {Milk, Beer} (s = 0.4, c = 0.5)
  {Milk} → {Diaper, Beer} (s = 0.4, c = 0.5)
- Observations:
  - All the above rules are binary partitions of the same itemset {Milk, Diaper, Beer}
  - Rules originating from the same itemset have identical support but can have different confidence
  - Thus, we may decouple the support and confidence requirements
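The observations suggest a simple way to enumerate such rules: split the itemset into every possible antecedent/consequent pair. A short self-contained sketch (mine, not the slides') that prints the six rules above, making the identical support and varying confidence visible:

```python
from itertools import combinations

transactions = [
    {"Bread", "Milk"},
    {"Bread", "Diaper", "Beer", "Eggs"},
    {"Milk", "Diaper", "Beer", "Coke"},
    {"Bread", "Milk", "Diaper", "Beer"},
    {"Bread", "Milk", "Diaper", "Coke"},
]

def sigma(itemset):
    return sum(1 for t in transactions if itemset <= t)

itemset = {"Milk", "Diaper", "Beer"}
n = len(transactions)

# Every non-empty proper subset X of the itemset yields one rule X -> itemset \ X.
for r in range(1, len(itemset)):
    for antecedent in map(set, combinations(sorted(itemset), r)):
        consequent = itemset - antecedent
        s = sigma(itemset) / n                  # identical for all six rules: 0.4
        c = sigma(itemset) / sigma(antecedent)  # varies per rule
        print(f"{sorted(antecedent)} -> {sorted(consequent)}: s={s:.2f}, c={c:.2f}")
```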

Mining Association Rules: Two-Step Approach
1. Frequent itemset generation: generate all itemsets whose support ≥ minsup
2. Rule generation: generate high-confidence rules from each frequent itemset, where each rule is a binary partitioning of a frequent itemset
Frequent itemset generation is still computationally expensive.

Frequent Itemset Generation
- Given d items, there are 2^d possible candidate itemsets
- Brute-force approach:
  - Each itemset in the lattice is a candidate frequent itemset
  - Count the support of each candidate by scanning the database, matching each transaction against every candidate
  - Complexity ~ O(NMw), where N is the number of transactions, M the number of candidates, and w the maximum transaction width
  - Expensive since M = 2^d!

Computational Complexity
- Given d unique items:
  - Total number of itemsets = 2^d
  - Total number of possible association rules:
    R = Σ_{k=1}^{d−1} [ C(d, k) × Σ_{j=1}^{d−k} C(d−k, j) ] = 3^d − 2^(d+1) + 1
  - If d = 6, R = 602 rules
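The closed form is easy to sanity-check by direct enumeration. This short script (my own check, not part of the slides) counts every rule X → Y with X and Y non-empty and disjoint:

```python
from math import comb

def count_rules(d):
    """Count rules X -> Y with X and Y non-empty, disjoint subsets of d items."""
    return sum(comb(d, k) * comb(d - k, j)    # choose |X| = k, then |Y| = j
               for k in range(1, d)
               for j in range(1, d - k + 1))

print(count_rules(6))    # 602
print(3**6 - 2**7 + 1)   # 602, matching the closed form R = 3^d - 2^(d+1) + 1
```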

Frequent Itemset Generation Strategies
- Reduce the number of candidates (M)
  - Complete search: M = 2^d
  - Use pruning techniques to reduce M
- Reduce the number of transactions (N)
  - Reduce the size of N as the size of the itemset increases
  - Used by DHP and vertical-based mining algorithms
- Reduce the number of comparisons (NM)
  - Use efficient data structures to store the candidates or transactions
  - No need to match every candidate against every transaction

Reducing Number of Candidates
- Apriori principle: if an itemset is frequent, then all of its subsets must also be frequent
- The Apriori principle holds due to the following property of the support measure:
  ∀X, Y: (X ⊆ Y) ⇒ s(X) ≥ s(Y)
  i.e. the support of an itemset never exceeds the support of its subsets
- This is known as the anti-monotone property of support

Illustrating the Apriori Principle
- Conversely, once an itemset is found to be infrequent, all of its supersets can be pruned from the itemset lattice.
- On the example transactions with minimum support count = 3:
  - Items (1-itemsets): Coke and Eggs are infrequent
  - Pairs (2-itemsets): no need to generate candidates involving Coke or Eggs
  - Triplets (3-itemsets): only one candidate remains
  - If every subset is considered: C(6,1) + C(6,2) + C(6,3) = 6 + 15 + 20 = 41 candidates
  - With support-based pruning: 6 + 6 + 1 = 13 candidates
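The 41-versus-13 count can be reproduced on the example table. Below is a self-contained sketch (mine, not from the slides; minsup is an absolute support count here, and the triplet merge anticipates the F_{k-1} × F_{k-1} scheme described later):

```python
from itertools import combinations

transactions = [
    {"Bread", "Milk"},
    {"Bread", "Diaper", "Beer", "Eggs"},
    {"Milk", "Diaper", "Beer", "Coke"},
    {"Bread", "Milk", "Diaper", "Beer"},
    {"Bread", "Milk", "Diaper", "Coke"},
]
minsup = 3  # minimum support count, as on the slide

def sigma(itemset):
    return sum(1 for t in transactions if itemset <= t)

items = sorted(set().union(*transactions))       # 6 distinct items
f1 = [i for i in items if sigma({i}) >= minsup]  # 4 frequent items (no Coke/Eggs)

# Candidate pairs: combine frequent items only -> C(4,2) = 6, not C(6,2) = 15.
c2 = [frozenset(p) for p in combinations(f1, 2)]
f2 = {c for c in c2 if sigma(c) >= minsup}

# Candidate triplets: merge frequent pairs sharing their first (sorted) item,
# then drop any candidate with an infrequent pair subset -> 1 candidate.
sorted_pairs = sorted(tuple(sorted(p)) for p in f2)
c3 = {frozenset(a) | frozenset(b)
      for a, b in combinations(sorted_pairs, 2) if a[0] == b[0]}
c3 = [c for c in c3
      if all(frozenset(s) in f2 for s in combinations(sorted(c), 2))]

print(len(items) + len(c2) + len(c3))  # 6 + 6 + 1 = 13 candidates in total
```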

Apriori Algorithm
- Method (a runnable sketch follows below):
  1. Let k = 1
  2. Generate frequent itemsets of length 1
  3. Repeat until no new frequent itemsets are identified:
     - Generate length-(k+1) candidate itemsets from the length-k frequent itemsets
     - Prune candidate itemsets containing subsets of length k that are infrequent
     - Count the support of each candidate by scanning the DB
     - Eliminate candidates that are infrequent, leaving only those that are frequent

Generating New Candidates by Two Operations
- Candidate generation: generates new candidate k-itemsets based on the frequent (k−1)-itemsets
- Candidate pruning: eliminates some of the candidate k-itemsets using the support-based pruning strategy
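A minimal self-contained Python version of this loop (my sketch, not code from the slides; candidates are generated by extending each frequent k-itemset with a frequent item, i.e. the F_{k-1} × F_1 scheme of the next slide, and minsup is an absolute count):

```python
from itertools import combinations
from collections import defaultdict

transactions = [
    {"Bread", "Milk"},
    {"Bread", "Diaper", "Beer", "Eggs"},
    {"Milk", "Diaper", "Beer", "Coke"},
    {"Bread", "Milk", "Diaper", "Beer"},
    {"Bread", "Milk", "Diaper", "Coke"},
]

def apriori(transactions, minsup_count):
    """Return {itemset: support count} for all frequent itemsets."""
    # k = 1: count single items and keep the frequent ones.
    counts = defaultdict(int)
    for t in transactions:
        for item in t:
            counts[frozenset([item])] += 1
    frequent = {s: c for s, c in counts.items() if c >= minsup_count}
    frequent_items = sorted({i for s in frequent for i in s})
    result, k = dict(frequent), 1
    while frequent:
        # Candidate generation: extend each frequent k-itemset with a frequent item.
        candidates = {s | {i} for s in frequent for i in frequent_items if i not in s}
        # Candidate pruning: drop candidates with an infrequent k-subset (Apriori principle).
        candidates = {c for c in candidates
                      if all(frozenset(sub) in frequent for sub in combinations(c, k))}
        # Support counting: one scan of the database per level.
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= minsup_count}
        result.update(frequent)
        k += 1
    return result

for itemset, n in sorted(apriori(transactions, 3).items(),
                         key=lambda kv: (len(kv[0]), sorted(kv[0]))):
    print(sorted(itemset), n)  # 4 frequent items and 4 frequent pairs at minsup = 3
```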

Candidate Generation
- Brute-force method: treat every k-itemset as a candidate; the number of candidate itemsets is C(d, k)
- F_{k-1} × F_1: merge a frequent (k−1)-itemset with a frequent item; still produces a large number of unnecessary candidates
- F_{k-1} × F_{k-1}: merge a pair of frequent (k−1)-itemsets only if their first k−2 items are identical; needs an additional candidate pruning step (see the sketch below)
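Since the prefix test is the easy part to get wrong, here is the F_{k-1} × F_{k-1} step in isolation (again my own sketch): itemsets are kept as sorted tuples so that "first k−2 items identical" becomes a simple prefix comparison, followed by the pruning step.

```python
from itertools import combinations

def merge_candidates(fk_minus_1):
    """F_{k-1} x F_{k-1}: merge two frequent (k-1)-itemsets (sorted tuples)
    when their first k-2 items are identical."""
    candidates = set()
    for a, b in combinations(sorted(fk_minus_1), 2):
        if a[:-1] == b[:-1]:  # identical first k-2 items
            candidates.add(tuple(sorted(set(a) | set(b))))
    return candidates

def prune(candidates, fk_minus_1):
    """Candidate pruning: drop candidates with an infrequent (k-1)-subset."""
    frequent = set(fk_minus_1)
    return {c for c in candidates
            if all(s in frequent for s in combinations(c, len(c) - 1))}

# Frequent 2-itemsets from the running example (minsup count = 3):
f2 = [("Beer", "Diaper"), ("Bread", "Diaper"), ("Bread", "Milk"), ("Diaper", "Milk")]
c3 = merge_candidates(f2)
print(c3)             # {('Bread', 'Diaper', 'Milk')} -- the single triplet candidate
print(prune(c3, f2))  # survives pruning: all of its 2-subsets are frequent
```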
