
changeset 95:a25a60a4bf43

.
author bshanks@bshanks-salk.dyndns.org
date Tue Apr 21 18:53:40 2009 -0700
parents e460569c21d4
children 3dd9a1a81c23
files grant.html grant.odt grant.pdf grant.txt
line diff
--- a/grant.html	Tue Apr 21 17:35:00 2009 -0700
+++ b/grant.html	Tue Apr 21 18:53:40 2009 -0700
@@ -664,9 +664,9 @@
 We will explore and compare different classifiers. As noted above, this activity is not separate from the previous one,
 because some supervised learning algorithms include feature selection, and any classifier can be combined with a stepwise
 wrapper for use as a feature selection method. We will explore logistic regression (including spatial models[15]), decision
-trees20 , sparse SVMs, generative mixture models (including naive bayes), kernel density estimation, genetic algorithms, and
-artificial neural networks.
-Decision trees
+trees20 , sparse SVMs, generative mixture models (including naive bayes), kernel density estimation, instance-based learning
+methods (such as k-nearest neighbor), genetic algorithms, and artificial neural networks.
+Application to cortical areas
 # confirm with EMAGE, GeneAtlas, GENSAT, etc, to fight overfitting, two hemis
 Develop algorithms to suggest a division of a structure into anatomical parts
 1.Explore dimensionality reduction algorithms applied to pixels: including TODO
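
The paragraph changed in this hunk notes that any classifier can be combined with a stepwise wrapper for use as a feature selection method. As a minimal sketch of that idea, assuming scikit-learn and synthetic placeholder data (neither appears in this changeset), greedy forward selection can wrap an arbitrary estimator:

    # Minimal sketch: a stepwise (sequential forward) feature-selection
    # wrapper around an arbitrary classifier. The dataset and feature
    # counts are synthetic placeholders, not the grant's actual data.
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=200, n_features=50,
                               n_informative=5, random_state=0)

    # Any estimator could go here; logistic regression matches the
    # first classifier listed in the paragraph.
    clf = LogisticRegression(max_iter=1000)

    # Greedy forward selection: add one feature at a time, keeping the
    # one that most improves cross-validated accuracy.
    selector = SequentialFeatureSelector(clf, n_features_to_select=5,
                                         direction="forward", cv=5)
    selector.fit(X, y)
    print("selected feature indices:", selector.get_support(indices=True))

This is how the paragraph's point plays out in practice: the wrapper turns whatever classifier is plugged in into a feature selection method, at the cost of retraining it once per candidate feature per step.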
Binary file grant.odt has changed
Binary file grant.pdf has changed
--- a/grant.txt	Tue Apr 21 17:35:00 2009 -0700
+++ b/grant.txt	Tue Apr 21 18:53:40 2009 -0700
@@ -525,11 +525,10 @@
 
 \vspace{0.3cm}**Classifiers**
 
-We will explore and compare different classifiers. As noted above, this activity is not separate from the previous one, because some supervised learning algorithms include feature selection, and any classifier can be combined with a stepwise wrapper for use as a feature selection method. We will explore logistic regression (including spatial models\cite{paciorek_computational_2007}), decision trees\footnote{Actually, we have already begun to explore decision trees. For each cortical area, we have used the C4.5 algorithm to find a decision tree for that area. We achieved good classification accuracy on our training set, but the number of genes that appeared in each tree was too large. We plan to implement a pruning procedure to generate trees that use fewer genes.}, sparse SVMs, generative mixture models (including naive bayes), kernel density estimation, genetic algorithms, and artificial neural networks.
-
-
-
-\vspace{0.3cm}**Decision trees**
+We will explore and compare different classifiers. As noted above, this activity is not separate from the previous one, because some supervised learning algorithms include feature selection, and any classifier can be combined with a stepwise wrapper for use as a feature selection method. We will explore logistic regression (including spatial models\cite{paciorek_computational_2007}), decision trees\footnote{Actually, we have already begun to explore decision trees. For each cortical area, we have used the C4.5 algorithm to find a decision tree for that area. We achieved good classification accuracy on our training set, but the number of genes that appeared in each tree was too large. We plan to implement a pruning procedure to generate trees that use fewer genes.}, sparse SVMs, generative mixture models (including naive bayes), kernel density estimation, instance-based learning methods (such as k-nearest neighbor), genetic algorithms, and artificial neural networks.
+
+\vspace{0.3cm}**Application to cortical areas**
+
 
 
 # confirm with EMAGE, GeneAtlas, GENSAT, etc, to fight overfitting, two hemis
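
The footnote carried through this hunk describes fitting one C4.5 decision tree per cortical area, with good training accuracy but too many genes per tree, and a planned pruning procedure. A minimal sketch of that one-tree-per-area scheme, assuming scikit-learn and random placeholder data; scikit-learn's CART trees with cost-complexity pruning stand in here for C4.5, which scikit-learn does not implement:

    # Minimal sketch: one pruned decision tree per cortical area
    # (one-vs-rest), counting how many genes (features) each tree uses.
    # CART + cost-complexity pruning is a stand-in for C4.5.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n_pixels, n_genes, n_areas = 500, 100, 4    # placeholder sizes
    X = rng.random((n_pixels, n_genes))         # expression per pixel
    areas = rng.integers(0, n_areas, n_pixels)  # area label per pixel

    for area in range(n_areas):
        y = (areas == area).astype(int)         # this area vs. the rest
        # ccp_alpha > 0 prunes the tree, trading some training accuracy
        # for fewer nodes and hence fewer distinct genes per tree.
        tree = DecisionTreeClassifier(ccp_alpha=0.01,
                                      random_state=0).fit(X, y)
        used = np.unique(tree.tree_.feature[tree.tree_.feature >= 0])
        print(f"area {area}: {len(used)} genes, "
              f"train acc {tree.score(X, y):.2f}")

Raising ccp_alpha shrinks each tree and so caps the gene count per area, which is the trade-off the footnote's planned pruning procedure is after.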