Comments
  Technically · Justin Gage · 7/20/20 · 5 min
    4 reads · 2 comments · 9.0
    • mads · 3 years ago

      This article breaks down a complicated topic into the basics. Good primer before diving in deeper.

    • deephdave · 3 years ago

      GPT-3 has picked up a lot of buzz for how good it is: it can generate entire published articles, poetry and creative writing, and even code.

      The OpenAI team trained it with 175 billion parameters, which, according to them, is “10x more than any previous non-sparse language model.” They used a 45 TB dataset of plaintext words (45,000 GB), filtered it down to a measly 570 GB, and used 50 petaflop/s-days of compute (a petaflop/s-day is roughly 10^20 operations, times 50).
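
      The petaflop/s-day arithmetic is easy to sanity-check: 10^15 operations per second, sustained for the 86,400 seconds in a day, comes out to roughly 10^20 operations. A minimal back-of-the-envelope sketch in Python (the 50-unit multiplier is taken from the figure quoted above, not computed here):

          # One petaflop/s-day: 1e15 floating-point operations per second,
          # sustained for a full day.
          PETAFLOPS = 1e15                     # operations per second
          SECONDS_PER_DAY = 60 * 60 * 24       # 86,400 seconds

          ops_per_pfs_day = PETAFLOPS * SECONDS_PER_DAY
          print(f"1 petaflop/s-day   ~ {ops_per_pfs_day:.2e} ops")   # ~8.64e19, i.e. roughly 10^20

          # The training figure quoted above: 50 petaflop/s-days.
          total_ops = 50 * ops_per_pfs_day
          print(f"50 petaflop/s-days ~ {total_ops:.2e} ops")         # ~4.32e21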