The guide incorrectly stated that the AlexaTM 20B model is not a
transformer-based model; in fact it is. While architecturally distinct
from OpenAI's GPT, it *is* built on transformers.
Early feedback asked for:
- Clearer general references to compute and memory resources (and
scaling models)
- Formatting improvements
- Clarifications in the prompt and token sections
- Expanded coverage of token limit strategies
Thank you, Muzz!
This adds the standard CODE_OF_CONDUCT.md that Brex uses in open source
projects, along with our standard CONTRIBUTING.md guidelines and
CONTRIBUTORS.md list.