Will there be AI scams in gambling? Top influencer Cr1TiKaL warns viewers about the dangers of AI voice scams, and in gambling, the technology will only continue to evolve.
At the moment, many are in awe of the capabilities being displayed by AI technology. This is perhaps because much of what is out there appears to be harmless on the surface.
However, the growing number of scams inspired by AI is becoming an increasing concern. As Cr1TiKaL states in the video we have linked below, you might report one scam, but another two will come your way.
As AI continues to advance, we are likely to reach the point where these videos can be made in minutes and distributed in seconds across various platforms. Therefore, should the rise of AI worry you?
While there is no nice way to be scammed, the stress endured can vary based on the methods used. Some criminals have messaged parents using AI clones of their children's voices, convincing them their children are in danger. Clearly, this method is incredibly effective, since parents would do anything to see their children returned safely.
Even when scammers don't go to such dark lengths, they can still fool many people by sending voice messages in which a loved one appears to ask for a simple money transfer: for example, "lend me €100, my bank account is frozen, I'll pay you back next week".
In isolation, that is easy enough to believe. Scammers are also very aware of which demographics are more likely to fall for such messages.
Worryingly, according to Cr1TiKaL, AI is still in its infancy, and it won't take long before today's capabilities look primitive. The evolution of the iPhone is a testament to this: only about 15 years ago, simply playing a video on such a device was seen as a feat.
As humans, we quickly forget how we felt at the time, and because the evolution happens before our eyes, we don't always give enough credit to how quickly the technology has advanced.
The image and voice of MrBeast, the insanely popular YouTuber, are being used for a giveaway scam, and giveaways are exactly what he is known for, which makes the fake all the more plausible.
As Cr1TiKaL states, this type of video should trigger numerous red flags for anyone who is somewhat internet savvy. The speech is not entirely fluent, and the delivery is not as engaging as it could be.
It is good enough to trick children or older people, though, and if the video reaches millions, the scam doesn't need a great conversion rate to be successful.
Many people in this world are desperate for money or, at the very least, would benefit hugely from the financial rewards these scams promote.
With this in mind, although you may feel you would never fall for a scam like this, it is always useful to put yourself in the shoes of those who might.
You might have already gone down the AI rabbit hole and can see this particular example a mile off, but that is certainly not the case for everyone.
This has been a concern of mine for quite some time now, and when Cr1TiKaL alluded to it, my ears perked up.
If we get to the point where AI can make a video of someone and there is no way to distinguish it from our reality, how can we ever use video evidence to support our claims?
In some cases, people accused of committing a crime have claimed that the video evidence against them was AI-generated. That may be hard to believe right now, but in the future, how will we know for sure that a released video is real?
Every politician caught in bed with someone other than their spouse will have a ready-made excuse to hand. The future Prince Andrew will even get away with his crimes. Oh wait!
Perhaps there will be no law and order when that time comes, since accountability will be out the window.
Online casinos spend millions each year on measures to protect themselves and those who use their platforms. After all, online casinos are ideal places for criminals to launder money, and underage gambling is a big concern worldwide.
KYC (Know Your Customer) checks are now the norm at popular online casinos, and for now, they are incredibly effective. In essence, a good KYC service makes it 99.9% certain that the player setting up a gambling account is exactly who they claim to be.
However, what happens if, in the future, someone can replicate another person's ID without detection? In cases where gambling platforms are suspicious, they might ask for a selfie or a liveness check, where you have to blink or tilt your head from left to right, for example.
For now, that seems impossible for a criminal to overcome. But if we reach a point in the near future where it isn't, what would our next step be?
The truth is, no one has the answer to these questions. Scams have been growing increasingly elaborate for years, but are we prepared for what is to come?