genshiryoku

genshiryoku t1_j55pz1h wrote

Scaling up transformer models like GPT isn't going to result in AGI. Almost every AI expert, including researchers working at OpenAI, agrees with this.

We need a new architecture to reach AGI: one with both short-term and long-term memory, multi-modality, and less need for training data.

The current path of scaling up transformer models will stagnate at GPT-4 or GPT-5 because we simply don't have enough data on the collective internet to keep scaling further than that.
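As a rough, back-of-the-envelope sketch of that data bottleneck: the ~20-tokens-per-parameter ratio comes from DeepMind's Chinchilla scaling paper, while the "usable web text" figure below is my own loose assumption, not a measured number.

```python
# Back-of-envelope sketch of the training-data bottleneck, using the
# compute-optimal heuristic of ~20 training tokens per parameter
# (Hoffmann et al., 2022). The web-supply figure is an assumption.

TOKENS_PER_PARAM = 20
USABLE_WEB_TOKENS = 10e12  # assume ~10 trillion usable public-text tokens


def tokens_needed(params: float) -> float:
    """Training tokens for compute-optimal training at this model size."""
    return params * TOKENS_PER_PARAM


for params in (175e9, 1e12, 10e12):  # GPT-3 scale, 1T, 10T parameters
    need = tokens_needed(params)
    print(f"{params:.0e} params -> {need:.1e} tokens "
          f"({need / USABLE_WEB_TOKENS:.2f}x the assumed web supply)")
```

Under these assumptions, a 1T-parameter model already wants about twice the assumed supply of public text, which is the stagnation point the comment is gesturing at.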

27

genshiryoku t1_j18aj4q wrote

I really like your comment.

The curated website list you describe is how things used to work with places like Jeeves.com before Google took off. I've been in the industry long enough to have worked with that.

I don't think we'll return to curated lists on search engines. Instead I think the search engine itself will be disrupted and slowly go away. Most junior software engineers working under me seem to be using ChatGPT as a Google, asking it for information, solutions, and explanations of bugs in their code, and strongly preferring it over something like Google+StackOverflow.

I think this is a sign of where things are headed. Search engines are probably going to get replaced with "AI search", which will probably get a better name. Essentially like how elderly people already use Google right now: just ask the AI something in human language and the AI will give you all the relevant information. It's likely that humans won't visit websites directly at all; instead the AI will go and extract all the relevant information, and make screenshots and videos of whatever you want to find/read/consume.

This is probably going to be web 3.0: a human interfacing with the AI, never actually using the internet themselves, with all internet traffic instead going through this AI middleman. I could see this happening over the next 5-10 years.

7

genshiryoku t1_j189nmi wrote

Something related to your area of expertise that has a physical dimension to it, like actually having to be physically present and doing something for SMM.

If you're upskilling, be sure to think about this physical factor. For example, in the IT sector embedded software engineers have a leg up on general software engineers because there is a physical component to their work which is hard(er) to automate away.

If you're thinking about continuing the SMM way, be sure to specialize in more abstract/unconventional ways that add a physical dimension to it. So not merely typed text on social media or written strategies.

I'm not an expert on marketing though, so you yourself will have to find out on your own how to navigate this disruption.

1

genshiryoku t1_j185rdk wrote

I think search engines themselves are most likely going to implode or be disrupted.

A lot of people have switched to ChatGPT for explanations and questions that normally would have been the territory of search engines.

So instead of thinking about ChatGPT populating search engines with SEO techniques and clickbait, I think AI will cut out the middle-man and internet searches will primarily be done through AI conversations.

I say this as a very senior person working in IT. You are right to question your job security. Think about how SEO would work in an AI-search world. The only viable technique I can think of right now is to populate the training set (the internet) with terms to such a degree that the trained AI model makes correlations between certain terms and whatever you want to optimize for, for marketing reasons. Although that would be considered a bug by the AI designers and thus isn't a long-term strategy.
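A toy illustration of that "populate the training set" idea: flooding a corpus with two terms together raises their co-occurrence statistics, which is the kind of correlation a model trained on that corpus can pick up. This is a hand-rolled sketch with made-up documents and a fictional brand name ("acmeboots"), not how production LLMs actually learn associations.

```python
from collections import Counter
from itertools import combinations


def cooccurrence(corpus: list[str]) -> Counter:
    """Count how often each pair of words appears in the same document."""
    counts = Counter()
    for doc in corpus:
        words = set(doc.lower().split())
        for pair in combinations(sorted(words), 2):
            counts[pair] += 1
    return counts


organic = ["best hiking boots review", "weather forecast today"]
# SEO-style flooding: repeat a document pairing the brand with target terms.
seeded = organic + ["acmeboots best hiking boots"] * 50

print(cooccurrence(organic)[("acmeboots", "hiking")])  # brand absent: 0
print(cooccurrence(seeded)[("acmeboots", "hiking")])   # brand flooded: 50
```

The same effect is why training-set curation and deduplication exist: from the model designers' side, this kind of injected correlation looks like a data-quality bug to be filtered out, which is why it wouldn't hold up as a long-term strategy.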

I will give you the same advice I'm currently giving to junior software developers: try to switch careers away from digital manipulation of data and add a physical dimension to your job if you still want to have a career in 5-10 years' time.

8

genshiryoku t1_j02mzn4 wrote

Slow burn; not because it isn't capable enough to automate away a large swath of jobs, but because society is slow at adoption.

The biggest problem the economy faces right now is that there are too many "bullshit jobs", and that companies aren't cutting those jobs but irrationally retaining them.

The story about jobs being rapidly automated away is largely a myth.

0

genshiryoku t1_j02mo0h wrote

It has to do with regulation, not ability. The medical field requires a lot of red tape before new technology is allowed. This is actually a good thing, because it can be a matter of life or death for a lot of people.

It takes 20-30 years from technical viability to actual implementation in the medical field. I'd expect the first very serious systems to enter the field in about 10 years' time.

3

genshiryoku t1_izxlmiv wrote

Short answer: No.

The big innovation with ChatGPT wasn't the LLM (which was still GPT-3). It was the interpreter and memory system at the front end that better understood what people were asking of it.

LLMs have also been trained on the vast majority of publicly available text. It's only going to become harder to train them as the data to train them on becomes the bottleneck.

2

genshiryoku t1_iy3x6rq wrote

Yeah Moore's Law has essentially ended. The fastest silicon computers we'll ever build will only be 1 or 2 orders of magnitude faster than current computers.

We need to find smart ways to conserve computing power, like making an AI render images less accurately or faking complexity in other ways.

Unless we move to graphene CPUs or some other substrate different from silicon, we'll probably never get very good VR.
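To put "1 or 2 orders of magnitude" into Moore's-Law terms, here's a quick sketch. The ~2-year doubling period is the historical rule of thumb (and is itself slowing), so the year estimates are illustrative only.

```python
import math

# How many Moore's-Law doublings a given speedup represents, and roughly
# how long they would take at the historical ~2-year doubling cadence.

DOUBLING_PERIOD_YEARS = 2  # historical rule of thumb, now slowing


def doublings(speedup: float) -> float:
    """Number of performance doublings needed to reach a given speedup."""
    return math.log2(speedup)


for speedup in (10, 100):  # 1 and 2 orders of magnitude
    d = doublings(speedup)
    print(f"{speedup}x speedup = {d:.1f} doublings "
          f"(~{d * DOUBLING_PERIOD_YEARS:.0f} years at one per 2 years)")
```

So even if the historical cadence somehow held, two orders of magnitude is only around 13 more years of doublings; if silicon scaling has ended, that headroom is all there is.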

0

genshiryoku t1_ixzjw1w wrote

Every IT specialist I know, including myself, is looking into switching careers. The writing is on the wall for the IT sector. This is like being a specialist working in a car factory in 1970s Detroit.

I feel bad for the couple of IT workers that seem to be in denial because they are going to be the ones hit the hardest as they have no Plan B set up.

12

genshiryoku t1_iwb2afh wrote

This is misinformation spread by conspiracy theorists. All seeds sold on the open market are already intellectual property. Farmers don't collect seed manually to re-grow a crop because that isn't cost-effective.

Monsanto engineered the crop on purpose not to produce seeds because it would save farmers time and money. I should also note that Monsanto seeds are more cost-effective than alternatives because you save a lot of money by not having to spray as many pesticides and by using less fertilizer, saving a lot in terms of both labor and resources. It's also better for the environment.

−14

genshiryoku t1_ivebzv9 wrote

Non-minds can already talk through things like GPT-3.

In the future these models will get more complex and more human sounding despite not actually having a mind.

This will continue until the point where there is a real digital mind, but by then people won't consider reasonable human dialogue to be a sign of one anymore. Hence there won't be a reason for people to consider it sentient just because it can have a reasonable conversation with you.

8

genshiryoku t1_itzp60m wrote

The main reason I don't believe in UBI roll-out is that most jobs in existence right now are already bullshit jobs that could be removed without productivity loss.

Jobs are primarily to keep people occupied and busy, not to actually provide productivity gains.

It's more than likely that AGI will make it possible to replace all human labor, but that everyone will still be employed through bullshit jobs that contribute nothing to society besides keeping everyone occupied.

Lockdowns proved that most jobs aren't essential and nothing gets lost when they aren't done.

2

genshiryoku t1_ittzimx wrote

Japan isn't a true capitalist society. For example, Japanese companies don't prioritize profit; they prioritize status.

A good example of this is how the West judges a successful company versus how Japan does. In the West a company's success is based on its market cap, so essentially its stock valuation. In Japan a company's success is based on the number of employees it has, since it's assumed that a company contributes more to Japanese society by "taking care" of that large number of people.

Startups in the West try to gain as much valuation as possible. Startups in Japan try to gain as many employees as possible.

During recessions Japanese companies refuse to fire employees because it would mean losing face and prestige for the company. Instead the CEO and managers all take pay cuts if necessary, sometimes even selling personal stocks or their homes, to ensure they don't fire anyone, since the main purpose of companies is to provide jobs, not to be financially solvent.

There's also a sense that "stability is the most important thing" in Japanese culture. This is why we have a saying that roughly translates to "It's more honorable to fail while doing what you know than it is to succeed by innovation". This is why Japanese companies rarely innovate. The stability of doing something you know is highly prized over doing something new that removes that stability, even if it leads to something better.

All of this combined means that most Japanese people, including young people, think of jobs as something sacred. Jobs are already not tied to productivity here; they're more of a social function.

If Japan got a "UBI" it would be in the form of guaranteed employment for everyone. The employment wouldn't have to be productive, but it would need to provide a sense of stability and community improvement for it to work. Something like everyone having to make the neighborhood more beautiful, clean, and nice to live in for 8 hours a day.

9

genshiryoku t1_ittpmdf wrote

I'm Japanese and everyone here knows we're just going to create more bullshit jobs once automation hits. Why? Because we're already doing so, and we don't have a proper welfare system because the "welfare system" is just bullshit jobs being created, like fax machine operator, where you are just trying to look busy while not actually doing anything.

We all know they are bullshit jobs but we still pretend like it's productive because that's a part of our culture.

I think all serious jobs will be automated over the next 10 years, but I think bullshit jobs will take the place of everything else.

10

genshiryoku t1_itd9dpk wrote

The point of no return is classified as 4C of warming by the Intergovernmental Panel on Climate Change (IPCC).

To give you some indication: we're currently between 0.8 and 1.2C of warming, and we're projected to reach 1.5C by 2030 and 2.0C by 2050. If we keep polluting at 2022's rate permanently, then we'll reach 4C of warming between 2250 and 2300.

So while climate change is a really bad situation, we're most likely not going to reach a society-ending threshold unless humanity is so stupid that it never curbs emissions before 2250.

10

genshiryoku t1_irqojfy wrote

It is. Historically, changes in wealth distribution never came from technological progress alone. It was always a social movement where people fought and gave their lives for the liberties and relative equality they have.

We're not going to get a post-scarcity distributed economy by itself. Lots of people are going to die fighting for that right, which is going to take decades, and the transition is going to be extremely bloody.

4

genshiryoku t1_irmxqml wrote

A consensus seems to be forming that we don't need any better technology than we already have right now.

If for some reason hardware progress stopped today and nothing new ever got made, it's possible that with the right architectural/software breakthroughs we could still reach AGI.

Yeah, Moore's Law is most likely going to end around the end of this decade, but we have more than enough processing power for the AGI revolution to still happen.

9

genshiryoku t1_ir98hyz wrote

100% agreed. Call me back when a large model demonstrates positive transfer between completely different skills. That is when I'll become convinced AGI might happen within the decade.

As long as there is no proof of positive transfer it's just going to stay very cool and powerful narrow AI.

Papers like the Gato paper show that positive transfer might be impossible with our current AI architectures, so AGI probably requires a large breakthrough. We can't simply ride the current wave of scaling up existing architectures and arrive at AGI.

11