[Corpora-List] Leave-one out vs. 10-fold cross validation

Georgios Paltoglou gpalto at gmail.com
Mon Apr 26 10:20:20 CEST 2010


Hello to everyone,

I just wanted to ask whether anyone is aware of formal reasons (e.g., error distribution, decreased validity of results) for opting for 10-fold cross validation instead of leave-one-out, apart from the obvious reason that 10-fold is more efficient and less time-consuming.

My two cents: leave-one-out seems more realistic, in the sense that if the overall aim of a system is to provide the best classification for new examples in an "application environment" given some training data, one would naturally train it on the largest possible training subset.
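To make the training-set-size point concrete, here is a minimal pure-Python sketch (the function names and the toy dataset size of 100 are my own illustration, not from the thread) showing how many examples each scheme trains on per fold:

```python
# Sketch: per-fold training-set sizes under leave-one-out vs. k-fold
# cross validation, for a hypothetical dataset of n examples.

def loo_train_sizes(n):
    # Leave-one-out: n folds, each training on the other n - 1 examples.
    return [n - 1 for _ in range(n)]

def kfold_train_sizes(n, k):
    # k-fold: split n examples into k (nearly) equal folds; each fold
    # trains on the examples outside the held-out fold.
    base, extra = divmod(n, k)
    fold_sizes = [base + 1 if i < extra else base for i in range(k)]
    return [n - fs for fs in fold_sizes]

n = 100
print(min(loo_train_sizes(n)))        # 99: LOO always trains on n - 1
print(min(kfold_train_sizes(n, 10)))  # 90: 10-fold trains on ~0.9 * n
```

So with n = 100, leave-one-out trains on 99 examples per fold versus 90 for 10-fold, which is the intuition behind preferring it when the goal is the best possible model from the available data.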

Thank you for your responses.

Best regards,

George

