
@chriswheatley@mastodonapp.uk @paul try to follow any of them back and if you can’t, your server has blocked bird.makeup

@drewdevault@fosstodon.org private repos being for paid accounts only is not a bad model!

It is funny to see some very large accounts that have been consistently anti-crypto move seamlessly into spreading FUD about ChatGPT.

Clearly there is a spectrum of human responses in terms of openness to novelty.

The same ofc applies to us. We are more open than average to new ideas.

Any experienced programmer worth their salt will tell you that •producing• code — learning syntax, finding examples, combining them, adding behaviors, adding complexity — is the •easy• part of programming.

The hard part: “How can it break? How will it surprise us? How will it change? Does it •really• accomplish our goal? What •is• our goal? Are we all even imagining the same goal? Do we understand each other? Will the next person to work on this understand it? Should we even build this?”


I think I figured it out, currently sending backlogged tweets!


bird.makeup is having some trouble with the current load. Looking into it!

@linuxgamingcentral@mastodon.social oh nice! Didn’t have to wait long for this

Lifecycle CO2 g/km, Nissan Leaf: 104
Lifecycle CO2 g/km, ebike: 22

Electric cars sold across all of Europe, 2022: 1.5-1.6 million
Ebikes sold just in Germany, 2022: 2.2 million

Panasonic 2170 battery cells, Tesla 3 (short range): 2,976
Panasonic 2170 battery cells, Malibu GT ebike: 65
(= 45 ebikes : 1 Tesla 3, or 140 ebikes : 1 Tesla X)

Cars parked per parking space: 1
Ebikes parked per parking space: ~10
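The ratios implied by the figures above can be checked with a few lines of arithmetic; this sketch uses the numbers exactly as quoted in the post, not independently verified:

```python
# Sanity-check the ratios quoted above, using the post's own figures.
leaf_gpkm = 104      # lifecycle CO2 g/km, Nissan Leaf
ebike_gpkm = 22      # lifecycle CO2 g/km, ebike
tesla3_cells = 2976  # Panasonic 2170 cells, Tesla 3 (short range)
ebike_cells = 65     # Panasonic 2170 cells, Malibu GT ebike

print(f"CO2 per km, Leaf vs ebike: {leaf_gpkm / ebike_gpkm:.1f}x")        # ~4.7x
print(f"Battery cells, Tesla 3 vs ebike: {tesla3_cells // ebike_cells}")  # 45
```

So one short-range Tesla 3 pack is roughly 45 ebike packs' worth of cells, which is where the "45 ebikes : 1 Tesla 3" line comes from.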

someone who is good at math please help me budget this. my city is dying

With all the GPT4-will-change-everything hype and fanfare happening now, it is worth mentioning that I just had to deal with some rural contractors that DIDN’T HAVE EMAIL ADDRESSES. The system of the world has more inertia than it sometimes seems.

People have a hard time defining woke without accidentally saying they want to be able to hate minorities and not face any consequences.

@avocado good to hear! Will close the bug soon if no one notices that problem anymore

@mlanger and leave out blind users? That’s a strong take!

I wonder if someone will eventually harvest the (high quality) image descriptions of the fediverse for AI training? 🤔

"Given both the competitive landscape and the safety implications of large-scale models like GPT-4, this report contains no further details about the architecture (including model size), hardware, training compute, dataset construction, training method, or similar." #gpt4

Join us at 1 pm PT today for a developer demo livestream showing GPT-4 and its capabilities/limitations: youtube.com/live/outcGtbnMuQ?f

(comments in Discord: discord.gg/openai)


I don't think people realize what a big deal it is that Stanford retrained a LLaMA model into an instruction-following form by **cheaply** fine-tuning it on inputs and outputs **from text-davinci-003**.

It means: If you allow any sufficiently wide-ranging access to your AI model, even by paid API, you're giving away your business crown jewels to competitors that can then nearly-clone your model without all the hard work you did to build up your own fine-tuning dataset. If you successfully enforce a restriction against commercializing an imitation trained on your I/O - a legal prospect that's never been tested, at this point - that means the competing checkpoints go up on bittorrent.

I'm not sure I can convey how much this is a brand new idiom of AI as a technology. Let's put it this way:

If you put a lot of work into tweaking the mask of the shoggoth, but then expose your masked shoggoth's API - or possibly just let anyone build up a big-enough database of Qs and As from your shoggoth - then anybody who's brute-forced a *core* *unmasked* shoggoth can gesture to *your* shoggoth and say to *their* shoggoth "look like that one", and poof you no longer have a competitive moat.

It's like the thing where if you let an unscrupulous potential competitor get a glimpse of your factory floor, they'll suddenly start producing a similar good - except that they just need a glimpse of the *inputs and outputs* of your factory. Because the kind of good you're producing is a kind of pseudointelligent gloop that gets sculpted; and it costs money and a simple process to produce the gloop, and separately more money and a complicated process to sculpt the gloop; but the raw gloop has enough pseudointelligence that it can stare at other gloop and imitate it.

In other words: The AI companies that make profits will be ones that either have a competitive moat not based on the capabilities of their model, OR those which don't expose the underlying inputs and outputs of their model to customers, OR can successfully sue any competitor that engages in shoggoth mask cloning.
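The recipe described above can be sketched in a few lines; this is a toy illustration, not Stanford's actual Alpaca pipeline, and `query_teacher` is a hypothetical stand-in for a real paid-API call to the teacher model:

```python
# Toy sketch of imitation fine-tuning: harvest (instruction, output)
# pairs from a "teacher" model's API, then use them as supervised
# fine-tuning data for a student model.
import json

def query_teacher(instruction: str) -> str:
    # Hypothetical stand-in for a real API call (e.g. to text-davinci-003).
    return f"[teacher answer to: {instruction}]"

def build_imitation_dataset(instructions):
    """Collect teacher I/O pairs in a typical SFT JSONL shape."""
    return [
        {"instruction": ins, "output": query_teacher(ins)}
        for ins in instructions
    ]

dataset = build_imitation_dataset([
    "Summarize the transformer architecture in one sentence.",
    "Write a haiku about distillation.",
])

# Each record becomes one supervised training example for the student.
for record in dataset:
    print(json.dumps(record))
```

Stanford's actual dataset was on the order of tens of thousands of such pairs; the point of the thread is that the expensive part, the teacher's curated fine-tuning behavior, leaks out through exactly this interface.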

"We offer no explanation as to why these architectures seem to work; we attribute their success, as all else, to divine benevolence" - Noam Shazeer (second author of the transformer paper, now CEO of Character AI)

from the SwiGLU paper: arxiv.org/abs/2002.05202v1
