
AI is the new SEO

Tech has its fads, and with each we get a plethora of talking heads ever-willing to enlighten us.

SEO is not dead yet, but unless a site is popular, Google will ignore over 90% of it, making SEO largely irrelevant for the huge majority of site owners. The new tech interest is AI, and YouTube has been inundated with it. An example is Microsoft's Copilot, which many are all too willing to tell us about, but all neglect to mention that it is not available to anybody unless they are part of a company with an enterprise contract with Microsoft.

And this is the issue with AI: while it is supposed to be something that will affect all our lives, it seems it will mainly affect those in big companies that consume a lot of rehashed information, or produce a lot of it. Those most likely to be replaced by AI are the 40% who think their jobs are useless. Any industry that relies upon variations of what it produced in the past will be most affected, like the entertainment industry, where rehashed franchises are becoming the norm. They are ripe for an AI takeover, including of their lazy executives.

Educationalists are worried about students using AI to complete assignments, and so they should be, but if they are expecting students to do fake tasks, why wouldn't the students use AI instead? After all, many are plugging AI for doing exactly the same extraneous tasks in business. It is clear that AI should be handled educationally by training people how to specify tasks to be done, whether by AI or by people. That is, education should now be directed towards accurate and directed goal and task setting, rather than actually doing the tasks.

Too many of the commentators are so enamoured with what the technology can do that they fail to see what it really means. For most people, AI will just become another form of entertainment and distraction, and for those who have to actually use it for work, it will be just another dubious tool that they have to use to get paid. Just another step in a technological merry-go-round that promises liberation from tedium, but just ends up causing more of it.

There is an old joke about a person who applies for a job where they are told about a monkey that will do a whole lot of interesting tasks. When they enquire what their job is, they are told it is to feed the monkey. Well, that is basically what people end up doing: feeding the latest technology monkey. The technology is supposedly getting smarter, but we are making up for that by getting dumber, and still the perennial promise of more leisure time fails to materialise, except by being made redundant, and then being victimised for being a drag on society.

Take the promises of any so-called technological revolution with a grain of salt, as all they do is change the medium by which we are enslaved. The increasing complexity of technology hides the fact that most people do not really understand the technology that they use. The advances are largely hidden from us, so that we think we are more advanced because we are told so, but nothing much really changes in how sophisticated our interactions with it are. That is because it is dumbed down for us, rather than us increasing our understanding of it.

Solve communication and info flows

AI needs guidance to be useful, so the means of specifying and imposing that guidance needs to be defined.

Besides the AI model itself, the source of its data and how it is queried need to be specified. Sourcing data has become contentious because, to train the AI used for search engines, their bots have been scraping massive amounts of internet data, much of it proprietary. Thus there has been a lot of pushback over these copyright violations.

But using any data can have its downsides. Sites such as Stack Overflow have had an influx of well-written but erroneous answers generated by AI. This is because not everybody has the experience to write properly-working programs, and AI does not know how to actually distinguish them (other than by their syntax and its implications), so it is a crapshoot as to whether the results are useful or not. Therefore, the datasets used to train AI, and what is added to them over time, have to be restricted to what is true for the types of queries to be asked of it.
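As a toy illustration of why syntactic validity is no guarantee of correctness (the function and data here are invented, not taken from any real Stack Overflow answer), the first version below parses and runs cleanly, yet silently mishandles the start of the sequence; only checking actual outputs exposes the error:

```python
def moving_average_buggy(values, window):
    # Looks plausible and runs without error, but divides by the full
    # window size even when fewer values are available near the start.
    return [sum(values[max(0, i - window + 1):i + 1]) / window
            for i in range(len(values))]

def moving_average_fixed(values, window):
    # Divide by the number of values actually present in each window.
    return [sum(values[max(0, i - window + 1):i + 1])
            / (i - max(0, i - window + 1) + 1)
            for i in range(len(values))]

data = [2.0, 4.0, 6.0]
print(moving_average_buggy(data, 3))  # first entry wrongly 0.666...
print(moving_average_fixed(data, 3))  # first entry correctly 2.0
```

Both versions would sail through any syntax check, which is the point: distinguishing them requires knowing what the code is supposed to do, not just whether it parses.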

Even generative AI will not be as productive if the source content is not suitable for allowing the types of permutations wanted. Feeding in a bunch of spaghetti westerns will only result in spaghetti westerns out. Movies created on demand based upon a few prompts may be promising, but people will soon get sick of them if they are only fed on the vast back-catalogues of C-grade movies.

An oft-cited maxim of data flow is that garbage in results in garbage out, but what comes out may be derived from correct data and still be of dubious value, either because it contains too much extraneous information, or because it is misleading. Just as the usefulness of search engines depends upon the quality of the query words used, the same applies to AI queries, even if the data is not sourced from the internet. The better people are at communicating their intent and desires, the more relevant the query results will be.
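A minimal sketch of how query quality drives relevance, using a crude word-overlap scorer over an invented set of documents (both the scoring scheme and the documents are illustrative assumptions, not how any real search engine works): a vague query leaves the ranker with a tie, while a specific one produces a clear winner.

```python
def score(query, document):
    """Count how many distinct query words also appear in the document."""
    return len(set(query.lower().split()) & set(document.lower().split()))

docs = [
    "resetting a forgotten router admin password",
    "choosing a password manager for a small team",
    "router firmware update walkthrough",
]

def search(query):
    # Rank documents by word overlap with the query, best match first.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)

print(search("password")[0])                     # vague: two docs tie on 1
print(search("reset router admin password")[0])  # specific: clear winner
```

The specific query matches three words of the first document, so it ranks decisively; the one-word query cannot separate the first two documents at all.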

What these two issues highlight is that if data gathering and query framing were improved, AI would perform better. Currently, AI seems to be aimed at giving something useful to people who cannot do either of these tasks well. This raises the question of whether AI is the best use of resources, as we might be a lot better off if we taught people how to find relevant data and ask meaningful questions, because that may enable us to properly build our information systems, together with meaningful ways of extracting that information.

A sarcastic maxim goes that making something idiot-proof means only idiots will use it, yet that is, in effect, who much of our so-called complex systems are being promoted to. In essence, as mass consumers of information, we are being shepherded away from being in control and masters of that information towards it being just another form of self-serve ice-cream. This process gives false justification for tech companies to invest the world's resources heavily in AI, and thus justify their own existence, while pretending to us that they are helping us.

For any enterprise, it may be better to hone information and reporting flows so that the types of information actually needed are available when needed. Search engines would be useless if their source information was actually constructed in the way that they infer. People rely upon the results pointing to well-constructed sources, rather than bunches of random words, no matter how many monkeys typed them in. Storing information in better-structured ways will do far more for access to it and its usefulness than fancy ways of conning it out of jumbled messes.
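The point about structure can be sketched with a contrived record (the order data here is invented): the same facts stored in named fields can be queried directly, while the free-text form needs pattern matching that breaks the moment the wording shifts.

```python
import re

# Free-text form: extracting the total means guessing at the phrasing.
notes = "Order 1042 from Acme, 3 items, total 59.90, shipped Tuesday"

# Structured form: each fact has a named, typed place.
order = {"id": 1042, "customer": "Acme", "items": 3, "total": 59.90}

# Querying the structured record is direct: no parsing, no guesswork.
print(order["total"] > 50)  # True

# The free-text version needs a regular expression that only works
# while this exact wording holds.
match = re.search(r"total (\d+\.\d+)", notes)
print(float(match.group(1)) > 50)  # True, but fragile
```

Both answers agree here, but only the structured query keeps working when the record is written differently, which is the sense in which structure beats clever extraction.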

In a way, AI is just another way to avoid taking responsibility for being organised. It is an illusion that actually relies upon being highly organised and constructed underneath. Given the amount of resources required just to access information, it is probably the most inefficient way of doing so. Learning how to structure and extract information is a useful ongoing skill that avoids having to enlist copious technology to bypass that learning. A final maxim: it is better long-term to teach a person to plant fields than to provide food for them every day.

Powered by: Smallsite Design © Patanjali Sokaris