Steven Berlin Johnson has an interesting blog entry on his use of DevonThink Pro:

Over the past few years of working with this approach, I’ve learned a few key principles. The system works for three reasons:

1) The DevonThink software does a great job at making semantic connections between documents based on word frequency.

2) I have pre-filtered the results by selecting quotes that interest me, and by archiving my own prose. The signal-to-noise ratio is so high because I’ve eliminated 99% of the noise on my own.

3) Most of the entries are in a sweet spot where length is concerned: between 50 and 500 words. If I had whole eBooks in there, instead of little clips of text, the tool would be useless.

I have been using DevonThink Pro for around half a year now, and I am quite happy with how quickly it lets me write, as well as with its very nice contextual, semantics-based search features. My problem relates to point 3: I have added my whole research library to the system, which generates quite a lot of “noise” — both slowing everything down and making searches less precise.

Published by Alexander Refsum Jensenius, a music researcher and research musician living in Oslo, Norway.