Discussion:
[Scikit-learn-general] Stacking Classifier
Dan Shiebler
2015-12-16 03:22:33 UTC
Permalink
Hello,

I have some code and tests written for a StackingClassifier that has an
sklearn-like interface and is compatible with sklearn classifiers. The
classifier contains methods to easily train classifiers on different
transformations of the same data and train a meta-classifier on the
classifier outputs. Where would be the best place for this code? I believe
this class would be a useful addition to sklearn.ensemble.
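To give a rough idea of the interface, here is a much-simplified sketch (illustrative names only, not the actual code): base classifiers are fit on the training data and a meta-classifier is fit on their predicted probabilities.

import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin, clone

class SimpleStackingClassifier(BaseEstimator, ClassifierMixin):
    """Illustrative sketch only: fit several base classifiers and a
    meta-classifier on their predicted probabilities."""

    def __init__(self, base_classifiers, meta_classifier):
        self.base_classifiers = base_classifiers
        self.meta_classifier = meta_classifier

    def fit(self, X, y):
        # Fit each first-level classifier on the training set; how the
        # meta-classifier's training inputs are generated (in-sample vs.
        # held-out) is a separate design question.
        self.fitted_base_ = [clone(c).fit(X, y) for c in self.base_classifiers]
        meta_features = np.hstack([c.predict_proba(X) for c in self.fitted_base_])
        self.fitted_meta_ = clone(self.meta_classifier).fit(meta_features, y)
        return self

    def predict(self, X):
        meta_features = np.hstack([c.predict_proba(X) for c in self.fitted_base_])
        return self.fitted_meta_.predict(meta_features)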

Thanks,
Dan
Andreas Mueller
2015-12-16 16:03:54 UTC
Permalink
I think stacking would be a nice contribution.
Are you doing leave-one-out / cross-validation to get the predictions of the
first level?
Otherwise this is basically ``VotingClassifier``.
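For concreteness, a minimal sketch of the cross-validated variant (written against the later sklearn.model_selection API; sklearn.cross_validation has a more limited cross_val_predict at the time of this thread), where the meta-classifier only ever sees out-of-fold predictions:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict  # later module layout

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
base_classifiers = [RandomForestClassifier(random_state=0), LogisticRegression()]

# Out-of-fold probabilities: each sample's meta-feature comes from a model
# that never saw that sample during fitting.
meta_features = np.hstack([
    cross_val_predict(clf, X, y, cv=5, method='predict_proba')
    for clf in base_classifiers
])

# Training the meta-classifier on in-sample first-level predictions instead
# would be much closer to what VotingClassifier already does.
meta_classifier = LogisticRegression().fit(meta_features, y)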

And in the "literature" version, all classifiers get the same data. We need
to think about whether and how we want to support passing different
representations to the different classifiers. Or is that just ``FeatureUnion``?
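For comparison, a sketch of both options (names here are only illustrative): ``FeatureUnion`` concatenates several representations for a single downstream estimator, whereas wrapping each first-level classifier in its own ``Pipeline`` would let each one see its own representation without changing the stacker's interface.

from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.preprocessing import StandardScaler

# FeatureUnion: the transformed representations are concatenated and fed to
# one estimator downstream.
union_model = Pipeline([
    ('features', FeatureUnion([('pca', PCA(n_components=5)),
                               ('scaled', StandardScaler())])),
    ('clf', LogisticRegression()),
])

# Per-classifier representations: each first-level classifier carries its own
# transformation, so a stacker can still pass the same X to every one.
base_classifiers = [
    Pipeline([('pca', PCA(n_components=5)), ('clf', LogisticRegression())]),
    Pipeline([('scaled', StandardScaler()),
              ('clf', RandomForestClassifier(random_state=0))]),
]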
Post by Dan Shiebler
Hello,
I have some code and tests written for a StackingClassifier that has
an sklearn-like interface and is compatible with sklearn classifiers.
The classifier contains methods to easily train classifiers on
different transformations of the same data and train a meta-classifier
on the classifier outputs. Where would be the best place for this
code? I believe this class would be a useful addition to sklearn.ensemble.
Thanks,
Dan
Dan Shiebler
2015-12-17 05:09:51 UTC
Permalink
Right now I'm splitting the dataset in half to get the predictions of the
first layer. I can make this configurable.
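Roughly, the configurable version could look something like this (a hypothetical helper with made-up names, using the later module layout for the imports):

import numpy as np
from sklearn.base import clone
from sklearn.model_selection import cross_val_predict, train_test_split

def first_level_features(base_classifiers, X, y, cv=None):
    """Hypothetical helper: cv=None keeps the current 50/50 split,
    an integer cv switches to out-of-fold predictions."""
    if cv is None:
        # Fit on one half, predict the held-out half; only the held-out
        # half is then available to train the meta-classifier.
        X_fit, X_held, y_fit, y_held = train_test_split(
            X, y, test_size=0.5, random_state=0)
        meta = np.hstack([clone(c).fit(X_fit, y_fit).predict_proba(X_held)
                          for c in base_classifiers])
        return meta, y_held
    # Cross-validated alternative: every sample gets an out-of-fold prediction.
    meta = np.hstack([cross_val_predict(c, X, y, cv=cv, method='predict_proba')
                      for c in base_classifiers])
    return meta, y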

You're right that the literature doesn't really cover different classifiers
working on different representations. I'll take this out.
Post by Andreas Mueller
I think stacking would be a nice contribution.
Are you doing leave-one-out / cross-validation to get the predictions of the
first level?
Otherwise this is basically ``VotingClassifier``.
And in the "literature" version, all classifiers get the same data. We need
to think about whether and how we want to support passing different
representations to the different classifiers. Or is that just ``FeatureUnion``?
Hello,
I have some code and tests written for a StackingClassifier that has an
sklearn-like interface and is compatible with sklearn classifiers. The
classifier contains methods to easily train classifiers on different
transformations of the same data and train a meta-classifier on the
classifier outputs. Where would be the best place for this code? I believe
this class would be a useful addition to sklearn.ensemble.
Thanks,
Dan
------------------------------------------------------------------------------
_______________________________________________
Scikit-learn-general mailing list
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general