@chriswheatley@mastodonapp.uk @paul try to follow any of them back and if you can’t, your server has blocked bird.makeup
Any experienced programmer worth their salt will tell you that •producing• code — learning syntax, finding examples, combining them, adding behaviors, adding complexity — is the •easy• part of programming.
The hard part: “How can it break? How will it surprise us? How will it change? Does it •really• accomplish our goal? What •is• our goal? Are we all even imagining the same goal? Do we understand each other? Will the next person to work on this understand it? Should we even build this?”
Lifecycle CO2 g/km, Nissan Leaf: 104
Lifecycle CO2 g/km, ebike: 22
Electric cars sold across all of Europe, 2022: 1.5-1.6 million
Ebikes sold just in Germany, 2022: 2.2 million
Panasonic 2170 battery cells, Tesla 3 (short range): 2,976
Panasonic 2170 battery cells, Malibu GT ebike: 65
(= 45 ebikes : 1 Tesla 3—or 140 ebikes : 1 Tesla X)
Cars parked per parking space: 1
Ebikes parked per parking space: ~10
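The 45:1 ratio above follows directly from the cell counts; a trivial sketch to check the arithmetic (the 2,976 and 65 figures are from the post, the division is mine):

```python
tesla3_cells = 2976   # Panasonic 2170 cells, short-range Tesla 3 (per the post)
ebike_cells = 65      # Panasonic 2170 cells, Malibu GT ebike (per the post)

# One short-range Tesla 3 pack holds as many cells as ~45 ebike packs.
print(tesla3_cells // ebike_cells)  # -> 45
```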
someone who is good at math please help me budget this. my city is dying
@avocado good to hear! Will close the bug soon if no one reports the problem again
@mlanger and leave out blind users? That’s a strong take!
"Given both the competitive landscape and the safety implications of large-scale models like GPT-4, this report contains no further details about the architecture (including model size), hardware, training compute, dataset construction, training method, or similar." #gpt4
Join us at 1 pm PT today for a developer demo livestream showing GPT-4 and its capabilities/limitations: https://youtube.com/live/outcGtbnMuQ?feature=share
(comments in Discord: https://discord.gg/openai)
I don't think people realize what a big deal it is that Stanford retrained a LLaMA model, into an instruction-following form, by **cheaply** fine-tuning it on inputs and outputs **from text-davinci-003**.
It means: If you allow any sufficiently wide-ranging access to your AI model, even by paid API, you're giving away your business crown jewels to competitors, who can then nearly-clone your model without all the hard work you did to build up your own fine-tuning dataset. And if you successfully enforce a restriction against commercializing an imitation trained on your I/O - a legal prospect that has, so far, never been tested - that just means the competing checkpoints go up on bittorrent.
I'm not sure I can convey how much this is a brand new idiom of AI as a technology. Let's put it this way:
If you put a lot of work into tweaking the mask of the shoggoth, but then expose your masked shoggoth's API - or possibly just let anyone build up a big-enough database of Qs and As from your shoggoth - then anybody who's brute-forced a *core* *unmasked* shoggoth can gesture to *your* shoggoth and say to *their* shoggoth "look like that one", and poof you no longer have a competitive moat.
It's like the thing where if you let an unscrupulous potential competitor get a glimpse of your factory floor, they'll suddenly start producing a similar good - except that they just need a glimpse of the *inputs and outputs* of your factory. Because the kind of good you're producing is a kind of pseudointelligent gloop that gets sculpted; and it costs money and a simple process to produce the gloop, and separately more money and a complicated process to sculpt the gloop; but the raw gloop has enough pseudointelligence that it can stare at other gloop and imitate it.
In other words: The AI companies that make profits will be the ones that either have a competitive moat not based on the capabilities of their model, don't expose the underlying inputs and outputs of their model to customers, or can successfully sue any competitor that engages in shoggoth mask cloning.
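The cloning move described above is just "collect the teacher's inputs and outputs, fine-tune a base model on them." A minimal sketch of the data-collection half, with a deterministic stub standing in for the teacher API (in the Stanford/Alpaca case that was text-davinci-003; the stub, the three example instructions, and the filename are all placeholders of mine):

```python
import json

# Toy stand-in for the teacher model's paid API. A real clone would
# make actual API calls here; the stub keeps the sketch self-contained.
def query_teacher(instruction: str) -> str:
    return f"[teacher answer to: {instruction}]"

# Step 1: a wide-ranging set of instructions. Alpaca used ~52k,
# themselves bootstrapped from seed tasks; these are placeholders.
instructions = [
    "Explain photosynthesis in one sentence.",
    "Write a haiku about rain.",
    "List three uses for a paperclip.",
]

# Step 2: record the teacher's input/output pairs.
dataset = [
    {"instruction": inst, "output": query_teacher(inst)}
    for inst in instructions
]

# Step 3: dump as JSONL, the format most instruction-tuning scripts
# consume. Fine-tuning a base model (e.g. a LLaMA checkpoint) on this
# file is the "look like that one" step.
with open("imitation_data.jsonl", "w") as f:
    for row in dataset:
        f.write(json.dumps(row) + "\n")
```

The point of the post is that step 1 is the only part the cloner has to think about: steps 2 and 3 are cheap and mechanical once the teacher is exposed over an API.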
"We offer no explanation as to why these architectures seem to work; we attribute their success, as all else, to divine benevolence" - Noam Shazeer (second author of the transformer paper, now CEO of Character AI)
from the SwiGLU paper: https://arxiv.org/abs/2002.05202v1
Open source developer. Wikidata, IPFS, Linux, Ethereum. /r/fuckcars enthusiast. I tend to boost funny stuff.