Credits are non-refundable. By proceeding, you agree to the Return Policy.
Article estimates assume an average English-language article length of 1,200 words.
Credits correspond to tokens, which are common sequences of characters found in text. A token translates to roughly ¾ of a word, so 100 tokens ≈ 75 words (languages with non-Latin alphabets, such as Greek, Arabic, and Chinese, produce more tokens per word and character). Processed words are tokenized inputs (e.g. an article and prompt) and outputs (e.g. a summary). You can calculate the number of tokens in a sentence using OpenAI's Tokenizer.
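The ¾-word rule of thumb above can be turned into a quick back-of-the-envelope estimate. Below is a minimal sketch (the function name and default are illustrative, not part of any official API):

```python
def estimate_tokens(word_count: int, words_per_token: float = 0.75) -> int:
    """Rough token estimate using the ~3/4-word-per-token rule of thumb.

    Note: this is only an approximation for Latin-alphabet text; non-Latin
    scripts typically produce more tokens per word.
    """
    return round(word_count / words_per_token)

# A 1,200-word article works out to about 1,600 tokens.
print(estimate_tokens(1200))  # → 1600
```

For an exact count, use a real tokenizer (such as OpenAI's Tokenizer mentioned above) rather than this estimate.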
Below is an example of a roughly 1,000-word New York Times article. A short summary cost 1,314 credits.