blueSGL

blueSGL t1_iuriw6r wrote

Matrix multiplication requires doing additions (and subtractions) and multiplications.

The additions (and subtractions) are cheap; the multiplications are the expensive part. When the entries are themselves matrix blocks, each multiplication is a full recursive matrix multiply, while the additions only cost O(n²).

By rejiggering the way the matrix multiplication is written you can use fewer multiplications and more additions, so it runs faster on the same hardware.

https://en.wikipedia.org/wiki/Strassen_algorithm

>Volker Strassen first published this algorithm in 1969

.....

>In late-2022, AlphaTensor was able to construct a faster algorithm for multiplying matrices for small sizes (e.g. 4x4 matrices over the field Z₂ in 47 multiplications, versus 49 by the Strassen algorithm or 64 using the naive algorithm).[2] AlphaTensor's result of 96 multiplications for 5x5 matrices over any field (compared to 98 by the Strassen algorithm) was reduced to 95 a week later with further human optimization.
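As a rough sketch of the trick: here is Strassen's 2x2 scheme in Python, written with scalar entries for readability. In the real algorithm the entries are matrix blocks and each of the 7 multiplications recurses, which is where the asymptotic win comes from.

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using 7 multiplications instead of
    the naive 8 (Strassen, 1969). Entries here are scalars; applied
    recursively to matrix blocks this gives the O(n^2.807) algorithm."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # 7 multiplications (the expensive operations)...
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # ...recombined with extra (cheap) additions and subtractions.
    c11 = m1 + m4 - m5 + m7
    c12 = m3 + m5
    c21 = m2 + m4
    c22 = m1 - m2 + m3 + m6
    return ((c11, c12), (c21, c22))

# Matches the naive product:
# ((1,2),(3,4)) @ ((5,6),(7,8)) == ((19,22),(43,50))
print(strassen_2x2(((1, 2), (3, 4)), ((5, 6), (7, 8))))
```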

7

blueSGL t1_itsji9p wrote

I'm sure I saw a Lex Fridman interview with a neuroscientist who said that conscious experience is a post hoc narrative of events, and that you can watch the brain commit to a choice on fMRI before the conscious observer thinks the decision has been made. Annoyingly I can't remember who he was interviewing.

10

blueSGL t1_itsf4uj wrote

>the majority of small businesses are not buying robots to replace people in the near term.

What about small businesses that can do the work remotely? A quick google suggests that somewhere between 1/4 and 4/10 of the entire workforce don't need to be physically present in a specific location to carry out their jobs.

And large business is already looking at automation. With control models like this making a six-axis arm simple to program and to reprogram for a different task, it only needs to be slightly better than a human on a cost/benefit analysis to be worthwhile. (Was the cost in these things ever mostly the hardware, or the software? I've never looked into it.)

3

blueSGL t1_itscy5u wrote

It all comes down to money in the end: if [business] can make more money by using AI, it will get used.

Are there going to be enough companies left doing things 'the old way' to keep employment numbers up, even though it's less cost effective?

> My grandmother still goes to the bank window to withdraw cash and has never used a computer in her life.

And yet people like her don't provide enough financial incentive to keep branches open.

https://www.bankingdive.com/news/us-banks-close-2927-branches-in-2021-a-38-jump/617594/

5

blueSGL t1_its73w0 wrote

The other confusion is that you don't need a general human-level AI in all fields to cost jobs. A collection of narrow AIs, each selected for a type of work and feeding into the next, will be able to replace jobs without even looking at the larger multimodal systems that are being built.

24

blueSGL t1_its6j7f wrote

> highly specialized machines.

That is, of course, until you can use a language model to instruct humanoid robots.

Something like this: https://vimalabs.github.io/ but for a humanoid/Teslabot body.

Then it's generic hardware and generic (likely fine-tuned) software.

2

blueSGL t1_itlmagr wrote

Thinking about [thing] necessitates being able to form a representation/abstraction of [thing]; language is a formalization of that which allows for communication. It's perfectly possible to think without a language attached, but more than likely having a language allows for easier thinking.

13

blueSGL t1_it7tzvt wrote

I've already seen people generate images for their RPG campaigns using Stable Diffusion. How many rule/campaign books will an LLM need to crunch through before it can spit out endless variations on a theme for your favorite system (or act as a major accelerator for those already creating them)?

Edit: actually, let's expand on this.

What happens when a sufficiently advanced model gets licensed for fine-tuning to paying companies, Wizards of the Coast feeds in the entire corpus of data they control, and starts using that to create (or help create) expansions and systems, with the former creators shifting to editors?

Now do that for every industry that has a text backbone somewhere in it. e.g. movie/tv scripts, books, comics, radio dramas, music video concepts, and so on.

5

blueSGL t1_it7gxtg wrote

> I don't foresee politicians giving up the reins easily.

They don't need to. Politicians are already advised by the groups they surround themselves with; if one of those advisors is an AI (or, more specifically, one of those human advisors uses an AI and passes off the advice as their own) and the politician gets ahead because of it, then they'd all want to use one.

Then you have all the fun of multiple competing agents, like flash crashes in the stock market caused by high-frequency trading algorithms battling it out.

Fun times ahead.

2

blueSGL t1_iszcalu wrote

> I have a bit of a theory on this actually. It's a combination of a couple things. The AI effect being the most obvious, where people will say AI can't do something, and when it does they dismiss it because it's just computer calculations. A moving goal post of sorts.

https://en.wikipedia.org/wiki/AI_effect

Also, I've a feeling a lot of jobs are going to be made redundant by collections of narrow AIs: you don't need AGI to replace a lot of jobs, just a small collection of specialist AIs that can communicate. I wondered why the Gato paper (from what I read of it) didn't try any cross-domain exercises, e.g. getting a robot arm to play an Atari game.

1

blueSGL t1_ispuvph wrote

>once you realize you're competing with your own open source distribution.

Ain't that the truth. I'm amazed when I listen to Emad Mostaque (founder of Stability AI) talk about their implementation of Stable Diffusion: he's going on about features 'coming soon' and I've already been running them for days/weeks on the AUTOMATIC1111 fork. It's just bizarre.

16