Token
Summary
A token is a small, meaningful unit used to represent information in computing. In programming languages, a lexer converts raw source text into tokens (identifiers, numbers, operators) that a parser can consume. In security and APIs, tokens carry identity and permissions, typically protected by a signature and bounded by an expiry time. In text processing and NLP, tokens are the words or subwords an analysis or model operates on. Across all of these contexts, tokenization adds structure, efficiency, and portability.
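To make the lexing sense of "token" concrete, here is a minimal sketch of a lexer in Python. The token names (NUMBER, IDENT, OP) and the input expression are illustrative choices, not taken from any particular language or library; the point is simply that raw text becomes a list of (kind, value) pairs a parser could consume.

```python
import re

# Illustrative token categories for a toy expression language.
# Each entry pairs a token kind with the regex that matches it.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),          # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"), # variable names
    ("OP",     r"[+\-*/=]"),     # single-character operators
    ("SKIP",   r"\s+"),          # whitespace: matched, then discarded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def lex(text):
    """Convert raw text into a list of (kind, value) tokens."""
    tokens = []
    for m in MASTER.finditer(text):
        kind = m.lastgroup
        if kind != "SKIP":  # whitespace carries no meaning to the parser
            tokens.append((kind, m.group()))
    return tokens

print(lex("x = 42 + y"))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]
```

The same pattern-matching idea underlies NLP tokenization as well, though production tokenizers there typically use learned subword vocabularies rather than hand-written regexes.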