# Names assumed for this sketch: ``difflib`` supplies the subsequence
# matching, and ``dictionary`` is the global word list that the
# docstrings below refer to.
import difflib

dictionary = []


def tokenize(myDocument):
    """Split the passed-in document into a list of word strings.

    The algorithm divides the text on whitespace (and, ideally, strips
    punctuation), so running it on a sentence returns a list like
    ['The', 'quick', 'brown', 'fox', 'jumped', 'over', 'the', 'fence'].
    """
    # A plain whitespace split covers the behaviour described above.
    return myDocument.split()


def bestMatch(word):
    """Find the 10 best matches for a word, judged by the letters it uses
    and the order in which they appear.

    For example, a passed-in word like "mtch" would return a list such as
    ['match', 'mitch', ...].  Matches are found with subsequence
    comparison and, ideally, phonetic matching for words that sound alike.
    """
    # Sketch only: difflib's subsequence-based scoring stands in for the
    # matching described above; phonetic matching would need an extra
    # library and is omitted here.
    return difflib.get_close_matches(word, dictionary, n=10, cutoff=0.3)


def addWord(word):
    """Append the word to the global dictionary list."""
    dictionary.append(word)


def removeWord(word):
    """Find the word in the global dictionary list and remove it
    (via list.remove)."""
    if word in dictionary:
        dictionary.remove(word)
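
# Hedged usage sketch: the sample words and sentence below are purely
# illustrative, and the demo assumes the ``dictionary`` list and the
# difflib-based ``bestMatch`` defined above.
if __name__ == "__main__":
    for entry in ["match", "mitch", "the", "quick", "brown", "fox"]:
        addWord(entry)

    print(tokenize("The quick brown fox jumped over the fence"))

    # Expected to list 'match' and 'mitch' first, ordered by
    # difflib's similarity score.
    print(bestMatch("mtch"))

    removeWord("mitch")
    print(bestMatch("mtch"))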