Feature Selection Techniques for Classification and Clustering

Product information

List price: NT$1748
Out of stock; ordered on demand (about 30-45 days to arrive)
Description

There are several feature selection techniques that can be used for classification and clustering, including:


Wrapper methods: These methods use a specific learning algorithm to evaluate the importance of each feature. Examples include forward selection and backward elimination.
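To make the wrapper idea concrete, the sketch below runs greedy forward selection. The "learning algorithm" is a deliberately simple nearest-centroid classifier scored on the training set; the data, model, and function names are illustrative only, and a real application would use a proper model with held-out evaluation.

```python
# Greedy forward selection: a minimal wrapper-method sketch.
# The evaluator is a trivial nearest-centroid classifier; any model
# exposing a fit/score interface could be substituted.

def nearest_centroid_accuracy(X, y, feats):
    """Train-set accuracy of a nearest-centroid classifier on the given features."""
    if not feats:
        return 0.0
    classes = sorted(set(y))
    centroids = {}
    for c in classes:
        rows = [x for x, label in zip(X, y) if label == c]
        centroids[c] = [sum(r[f] for r in rows) / len(rows) for f in feats]
    correct = 0
    for x, label in zip(X, y):
        # predict the class whose centroid is closest in the selected subspace
        pred = min(classes, key=lambda c: sum(
            (x[f] - m) ** 2 for f, m in zip(feats, centroids[c])))
        if pred == label:
            correct += 1
    return correct / len(y)

def forward_selection(X, y, n_features):
    """Add one feature at a time, keeping the one that most improves the score."""
    selected, remaining = [], list(range(len(X[0])))
    while remaining and len(selected) < n_features:
        best = max(remaining,
                   key=lambda f: nearest_centroid_accuracy(X, y, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[0.1, 5.0], [0.2, 1.0], [0.9, 4.8], [1.0, 1.2]]
y = [0, 0, 1, 1]
print(forward_selection(X, y, 1))  # -> [0]
```

Backward elimination is the mirror image: start from the full feature set and repeatedly drop the feature whose removal hurts the score least.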


Filter methods: These methods use a statistical test to evaluate the importance of each feature, independently of any learning algorithm. Examples include the chi-squared test and mutual information.
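As a small illustration of the filter idea, the sketch below scores categorical features by their chi-squared statistic of independence against the class label; the toy feature values are made up for the example, and a library routine would normally be used instead.

```python
# A minimal filter-method sketch: score each categorical feature by its
# chi-squared statistic against the class label, then keep the top scorers.
from collections import Counter

def chi2_score(feature, y):
    """Chi-squared statistic of independence between one feature and the labels."""
    n = len(y)
    observed = Counter(zip(feature, y))          # joint counts
    f_counts, y_counts = Counter(feature), Counter(y)
    stat = 0.0
    for fv in f_counts:
        for yv in y_counts:
            # expected count under independence of feature and label
            expected = f_counts[fv] * y_counts[yv] / n
            stat += (observed[(fv, yv)] - expected) ** 2 / expected
    return stat

# Feature A tracks the label perfectly; feature B is independent of it.
y = [0, 0, 1, 1]
feat_a = ["x", "x", "z", "z"]
feat_b = ["p", "q", "p", "q"]
print(chi2_score(feat_a, y), chi2_score(feat_b, y))  # -> 4.0 0.0
```

Because no model is trained, filter scores like this are cheap to compute even for thousands of features, which is exactly the high-dimensional setting discussed below.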


Embedded methods: These methods use a learning algorithm that has built-in feature selection capabilities. An example is Lasso regression in linear models, whose L1 penalty drives the coefficients of irrelevant features to exactly zero (Ridge regression, by contrast, only shrinks coefficients and so does not by itself select features).
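The embedded behaviour of the Lasso can be sketched with a small coordinate-descent solver: fitting the model zeroes out the noise feature's coefficient, so selection happens as a side effect of training. The data and the penalty value λ are illustrative; production code would use an optimized library implementation.

```python
# A minimal embedded-method sketch: Lasso regression by coordinate descent.
# The L1 penalty drives coefficients of irrelevant features to exactly zero.

def soft_threshold(rho, lam):
    """Shrink rho toward zero by lam; clip to zero inside [-lam, lam]."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso (no intercept; assumes roughly scaled features)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # correlation of feature j with the partial residual (j held out)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                                            for k in range(p) if k != j))
                      for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / z
    return beta

# Feature 0 drives y; feature 1 is pure noise and gets a zero coefficient.
X = [[1.0, 0.3], [2.0, -0.2], [3.0, 0.1], [4.0, -0.4]]
y = [2.1, 3.9, 6.2, 7.8]
beta = lasso(X, y, lam=1.0)
selected = [j for j, b in enumerate(beta) if b != 0.0]
print(beta, selected)  # beta[1] is exactly 0.0, so only feature 0 is kept
```

Larger values of λ zero out more coefficients, so the penalty strength directly controls how aggressive the selection is.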


Hybrid methods: These methods combine the strengths of wrapper and filter methods.


Correlation-based feature selection (CFS): This method selects features that are highly correlated with the target variable while having low correlation with one another.
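A simplified CFS-style sketch is shown below: it ranks features by absolute Pearson correlation with the target. Full CFS additionally scores feature *subsets*, penalizing redundancy among the selected features; that refinement is omitted here, and the toy data are illustrative.

```python
# Simplified correlation-based ranking: order features by |Pearson r| with
# the target. (Full CFS also penalizes correlation between features.)
from math import sqrt

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def rank_by_correlation(X, y):
    """Feature indices sorted by |correlation with the target|, strongest first."""
    p = len(X[0])
    cols = [[row[j] for row in X] for j in range(p)]
    return sorted(range(p), key=lambda j: -abs(pearson(cols[j], y)))

# Feature 0 is perfectly correlated with y; feature 1 only weakly.
X = [[1.0, 9.0], [2.0, 1.0], [3.0, 8.0], [4.0, 2.0]]
y = [1.0, 2.0, 3.0, 4.0]
print(rank_by_correlation(X, y))  # -> [0, 1]
```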


Recursive Feature Elimination (RFE): This method recursively removes attributes and builds a model on those that remain. It uses the model's accuracy to identify which attributes (and combinations of attributes) contribute most to predicting the target attribute.
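The loop behind RFE can be sketched as follows: fit a model, drop the least important feature, refit, and repeat. Here the model is a hand-rolled least-squares fit (gradient descent on standardized features, no intercept) and importance is the absolute fitted weight; practical RFE would use a library model and often cross-validated accuracy. All names and data below are illustrative.

```python
# A minimal RFE sketch: repeatedly fit a linear model, drop the feature with
# the smallest absolute weight, and return the surviving feature indices.

def standardize(col):
    """Zero-mean, unit-variance scaling so weight magnitudes are comparable."""
    n = len(col)
    m = sum(col) / n
    s = (sum((v - m) ** 2 for v in col) / n) ** 0.5 or 1.0
    return [(v - m) / s for v in col]

def fit_linear(X, y, lr=0.1, n_iter=1000):
    """Least-squares weights via coordinate-wise gradient descent (no intercept)."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            grad = sum((sum(X[i][k] * w[k] for k in range(p)) - y[i]) * X[i][j]
                       for i in range(n)) / n
            w[j] -= lr * grad
    return w

def rfe(X, y, n_keep):
    """Recursively eliminate the feature whose fitted weight is smallest."""
    p = len(X[0])
    cols = {j: standardize([row[j] for row in X]) for j in range(p)}
    active = list(range(p))
    while len(active) > n_keep:
        Xa = [[cols[j][i] for j in active] for i in range(len(y))]
        w = fit_linear(Xa, y)
        weakest = min(range(len(active)), key=lambda k: abs(w[k]))
        active.pop(weakest)
    return active

# y depends on features 0 and 2; feature 1 is noise and is eliminated first.
X = [[1.0, 0.2, 3.0], [2.0, -0.1, 1.0], [3.0, 0.3, 4.0], [4.0, -0.2, 2.0]]
y = [1.0 * r[0] + 2.0 * r[2] for r in X]
print(rfe(X, y, 2))  # -> [0, 2]
```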


Overall, the choice of feature selection technique will depend on the specific problem and dataset at hand.


Data mining tasks are often confronted with many challenges, the biggest being the large dimension of the datasets. For successful data mining, the most important criterion is dimensionality reduction of the dataset. The problem of dimensionality poses a serious challenge to the efficiency of data mining algorithms: they cannot handle such high-dimensional data, which renders the mining tasks intractable. Thus, it becomes necessary to reduce the dimensionality of the data.


There are two methods of dimensionality reduction: feature selection and feature extraction (Bishop, 1995; Devijver and Kittler, 1982; Fukunaga, 1990). Feature selection methods reduce the dimensionality of the original feature space by selecting a subset of features without any transformation, which preserves the physical interpretability of the selected features as in the original space. Feature extraction methods reduce the dimensionality by a linear transformation of the input features into a completely different space. This transformation alters the features, making them difficult to interpret: features in the transformed space lose their physical interpretability, and their original contribution becomes difficult to ascertain (Bishop, 1995).

The choice of dimensionality reduction method is entirely application-specific and depends on the nature of the data. Feature selection is especially advantageous because the features keep their original physical meaning, since no transformation of the data is made. This can be important for better problem understanding in applications such as text mining and genetic analysis, where only relevant information is analysed.
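The contrast between the two methods can be made concrete with a tiny sketch: selection keeps a subset of the original, named columns, while extraction replaces them with linear mixtures. The feature names and the transform matrix below are purely illustrative (a method like PCA would learn the transform from data).

```python
# Selection vs. extraction: keeping named columns vs. mixing them.

row = {"age": 42.0, "income": 55000.0, "height_cm": 171.0}

# Feature selection: keep a subset of the original, named features.
selected = {k: row[k] for k in ("age", "income")}

# Feature extraction: project onto new axes. The matrix here is arbitrary
# and illustrative, not fitted; each output mixes all three inputs.
values = [row["age"], row["income"], row["height_cm"]]
transform = [[0.5, 0.001, 0.2], [0.1, -0.002, 0.9]]
extracted = [sum(w * v for w, v in zip(axis, values)) for axis in transform]

print(selected)   # named features survive with their original meaning
print(extracted)  # two anonymous mixtures of all three inputs
```

The selected dictionary still says "age" and "income"; the extracted numbers have no such reading, which is exactly the interpretability loss described above.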



