I took a look at this once before, and it took me quite a bit of work to get it actually working. However, it works in a very simplistic manner, which may not be appropriate for your use. In the basic case it just computes word frequencies and keeps the sentences that contain the most frequent words.
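That frequency-based approach can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not the actual ots implementation; the function names are mine, and a real version would also strip stopwords so that "the" and "and" don't dominate the scores:

```python
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Keep the n sentences whose words are most frequent overall."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Global word frequencies across the whole text.
    freq = Counter(re.findall(r'[a-z]+', text.lower()))

    def score(sentence):
        # A sentence's score is the summed frequency of its words.
        return sum(freq[w] for w in re.findall(r'[a-z]+', sentence.lower()))

    ranked = sorted(sentences, key=score, reverse=True)
    top = set(ranked[:n_sentences])
    # Return the chosen sentences in their original order.
    return [s for s in sentences if s in top]
```

For example, `summarize("Cats are great. Cats purr and cats play. Dogs bark.", 2)` keeps the two cat sentences, since "cats" is the most frequent word.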
The quickest extension to this idea would be to include WordNet to compute frequencies of concepts. Of course in reality you probably would use something better than WordNet...
WordNet is actually quite good. If you're looking for some accessible NLP tools, NLTK (http://www.nltk.org/) is nice. It's written in Python, and uses WordNet in several ways.
The author of ots isn't really interested in working on it anymore. A couple of the features are deprecated, but this isn't documented.
apgwoz | 17 years ago | reply
brand | 17 years ago | reply
whughes | 17 years ago | reply