hunter2_ 4 hours ago

I was looking at laptops recently, and I noticed that the marketing blurbs on product pages are really getting extreme with the AI stuff in a weird SEO-like way. For example, this is on Costco's page for an Acer:

> AI Ready for Tomorrow

> Ready for the ever-evolving possibilities of AI? This Swift Go AI PC integrates Intel’s new dedicated AI engine—Intel® AI Boost—with Acer’s own AI solutions, for more intuitive and enjoyable AI experiences.

It really seems like some kind of stupid joke.

  • latentsea 12 minutes ago

    There is absolutely zero meaning that can be derived from that marketing blurb.

  • xdavidliu 3 hours ago

    > pages are really getting extreme with the AI stuff in a weird SEO-like way

    you mean in a weird generative AI way?

    • tdeck an hour ago

      "Certainly. Here is a new blurb that promotes the product while using the term AI four additional times."

galleywest200 4 hours ago

> Lack of familiarity with AI PCs leads to what the study describes as "misconceptions," which include the following: 44 percent of respondents believe AI PCs are a gimmick or futuristic; 53 percent believe AI PCs are only for creative or technical professionals; 86 percent are concerned about the privacy and security of their data when using an AI PC; and 17 percent believe AI PCs are not secure or regulated.

Is being concerned about your privacy really a misconception here?

  • zeta0134 4 hours ago

    What would be the point of an "AI PC" if not to run models locally? I'm very uncomfortable sending my keystrokes (or my codebase) off to some remote server. If I can run the model locally (see the sketch below), the privacy problems in theory vanish and I'm much more likely to use the tech.

    If folks don't understand that, then yes, I'd say it's a pretty big misconception, and local execution needs to be better marketed as a key feature.

    And if the AI PC is just a regular PC with a cloud bot integrated, then ... what even is the point? You can already do the remote chatbot thing with a regular PC, privacy nightmares included!
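
    A minimal sketch of the local route, assuming llama-cpp-python and a GGUF model file you've downloaded yourself (the path and model name are placeholders):

    ```python
    # pip install llama-cpp-python  (build with GPU support to use the graphics card)
    from llama_cpp import Llama

    # Hypothetical local model file; nothing ever leaves the machine.
    llm = Llama(
        model_path="./models/mistral-7b-instruct-q4_k_m.gguf",  # placeholder path
        n_gpu_layers=-1,  # offload every layer to the GPU if it fits
        n_ctx=4096,       # context window
    )

    out = llm("Summarize these meeting notes: ...", max_tokens=128)
    print(out["choices"][0]["text"])
    ```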

    • defnotai 4 hours ago

      Access to the latest foundation models, which can’t be run locally. AI feels like it’s in this really weird place where the latest Claude model sets expectations that can’t be matched by an on-device model.

      Even Apple Intelligence is getting a lot of negative feedback from reviewers due to limitations like being unable to summarize a very large document (which is pretty much the point of such a feature).

      The problem is that AI has few well-defined use cases and a mountain of expectations, and this really shows in these companies' execution. It's hard to build good products when the requirements are "we don't really know."

  • kardos 4 hours ago

    Do the AI PCs do the work locally, or transmit everything for remote processing like ChatGPT?

  • add-sub-mul-div 4 hours ago

    It's propaganda, as the article points out: "The chipmaker, which is quite keen to see people buy the AI PCs sold by its hardware partners,"

  • Dalewyn 4 hours ago

    It's also not a misconception that "AI" PCs are for creative/technical professionals when the vast majority of the marketing is about shitting out creative(?) or technical(?) works(?) to reduce and ultimately democratize that workforce.

    • dr_kiszonka 3 hours ago

      "Democratize" used to have positive connotations, as in "democratize access to information". I didn't anticipate that it would have negative ones.

Nevermark 4 hours ago

This is a natural phase of major tech breakthroughs. It is not a negative indicator. The opposite.

It is normal for new tech to have an exploratory period where its potential is clear but its immediate economic impact is negative, as it iterates toward product-market fit.

Normally, the producer of new tech eats most or all of the risk and cost of the search for product-market fit.

But some tech is so compelling that customers feel a strategic need to participate in the discovery loop too.

Obviously, there are upfront costs and risks in deploying and trying tech that is still hit-and-miss. But during a sea change, there is also risk in not experimenting and adopting/adapting early.

A1kmm 3 hours ago

I find the concept of "AI PC" to be somewhat nonsensical in the absence of a definition that is about the hardware.

Just working out the age of my personal desktop computer poses a Ship of Theseus problem, but it's safe to say 20+ years. However, it now has an RTX 3060 graphics card with 12 GB of VRAM and NVMe SSDs, and it can run inference on 7B-parameter, 4-bit-quantised Transformer LLMs (rough numbers below) and generate images with large diffusion models. I've also used it for many applications that would have counted as AI before the latest generative AI hype cycle.

So is it an AI PC? At what point did it become an AI PC? Or is a self-built machine in which you swap parts inherently never an AI PC?

Given how amorphously the term is defined, I would consider it purely marketing fluff.
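
For what it's worth, the rough numbers mentioned above, assuming a Llama-2-7B-style architecture (32 layers, 4096 hidden dimension, fp16 KV cache, no grouped-query attention); these are ballpark figures, not exact for any particular model:

```python
# Back-of-the-envelope VRAM budget for a 7B model quantised to 4 bits/weight.
params = 7e9
weight_gb = params * 4 / 8 / 1e9           # 4 bits = half a byte per weight
print(f"weights:  ~{weight_gb:.1f} GB")    # ~3.5 GB

# KV cache per token: 2 (K and V) * 32 layers * 4096 dims * 2 bytes (fp16).
kv_per_token = 2 * 32 * 4096 * 2           # ~0.5 MB per token
kv_gb = kv_per_token * 4096 / 1e9          # at a 4096-token context
print(f"KV cache: ~{kv_gb:.1f} GB")        # ~2.1 GB

# ~5.6 GB plus activations and overhead: comfortable on a 12 GB RTX 3060.
```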

  • safety1st 2 hours ago

    Assuming AI means LLM, at this stage I've come across two broad categories of implementation that are actually interesting and useful to me as a user.

    1) A box on the screen where I can chat with one to do ideation or really anything I want.

    2) A command-driven approach where I hit a hotkey, type a prompt, and the response is dumped out in front of me, possibly with some text selected that heavily influences the response (sketched below).

    These are both pretty cool tbh and developers will have a field day for years finding sensible ways to incorporate them into programs.
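
    A minimal sketch of pattern (2), assuming an OpenAI-compatible API and that your desktop environment binds the script to a hotkey and pipes the current selection in on stdin (the model name and the xclip binding are placeholder assumptions):

    ```python
    # Hypothetical hotkey binding:  xclip -o | python prompt.py "rewrite this formally"
    import sys
    from openai import OpenAI

    client = OpenAI()              # reads OPENAI_API_KEY from the environment
    command = sys.argv[1]          # the typed prompt
    selection = sys.stdin.read()   # selected text, heavily influencing the response

    resp = client.chat.completions.create(
        model="gpt-4o-mini",       # placeholder model name
        messages=[{"role": "user", "content": f"{command}\n\n{selection}"}],
    )
    print(resp.choices[0].message.content)
    ```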

    None of this has anything to do with driving the hardware upgrade cycle, since most of the models run in the cloud. But driving hardware upgrades is what these marketing people are really trying to do when they talk about "AI PCs." They are irrelevant people, but they need to convert everything they see into a reason to buy a new PC.

    That's what they get paid for. Monkey marketer see trend, monkey marketer do marketing. Monkey steal your attention.

    Maybe an LLM will replace THEM soon. After all, it's basically a digital version of the million monkeys on typewriters...

  • hunter2_ 2 hours ago

    I think we just have a tendency to anthropomorphize things that are a bit too complicated to understand fully, like a child calling the clutch mechanism in a yo-yo a "brain." It's not that the yo-yo can really think; it just behaves as if it does. So once you've upgraded your system to the point where you can't fully understand how something that isn't human achieves whatever emergent behavior you're seeing, go ahead and anthropomorphize it by calling it AI.

    There's no specific line in the sand, although tasking it with machine learning (in which outcomes improve as runtime inputs are collected, rather than only when its creator adds capabilities) would be a decent one. That's fairly human-like, while non-ML workloads are more plant-like.
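
    A toy illustration of that line in the sand, contrasting a fixed rule shipped by its creator with a predictor that improves from runtime inputs (all numbers hypothetical):

    ```python
    # Fixed rule: behavior never changes, no matter how many inputs it sees.
    def fixed_estimate(_history):
        return 30.0  # creator hard-coded "a commute takes 30 minutes"

    # ML-ish rule: the estimate improves as runtime inputs accumulate.
    class RunningMean:
        def __init__(self):
            self.n, self.total = 0, 0.0
        def observe(self, x):          # collect a runtime input
            self.n += 1
            self.total += x
        def estimate(self):
            return self.total / self.n if self.n else 30.0

    model = RunningMean()
    for minutes in [41, 38, 45, 40]:   # hypothetical observed commutes
        model.observe(minutes)
    print(fixed_estimate(None), model.estimate())  # 30.0 vs. 41.0
    ```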

maeil 3 hours ago

> The chipmaker, which is quite keen to see people buy the AI PCs sold by its hardware partners

Hah, no they very much aren't: Intel is an insignificant player in this space at the moment. As long as Intel is as far behind as it is, they'd rather overall investment in hardware go down. The second Intel comes out with a leading chip, you'll suddenly see them produce a study with the opposite result.

rileymat2 3 hours ago

I am not an AI booster, but I would expect a learning curve that slows down current tasks with pretty much any new technology that has to be learned.