Hackers Created a Fake Yuga Labs Website to Trick Collectors Into Handing Over $6.2 Million Worth of Bored Apes and Other NFTs
The attack targeted users of the Otherside universe, a project created by Yuga Labs.
Dorian Batycka, May 5, 2022
https://news.artnet.com/art-world/bored-ape-yacht-club-hack-2109480

Phishing scammers targeting the largest NFT mint in history have made off with millions in valuable Bored Apes (BAYCs) and other NFTs.

On May 1, hackers posing as administrators of Otherside, the new virtual game and metaverse from BAYC creator Yuga Labs, lured crypto collectors with a fake website designed to look identical to the official one, making off with approximately $6.2 million worth of BAYCs and other popular non-fungible artworks.

The attackers spread dubious links on Twitter to a website designed to look like Otherside's official site, which then prompted users to connect their NFT wallets.
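Neither Yuga Labs nor investigators have published the phishing site's code, but NFT "drainer" pages of this era typically worked by getting victims to sign a blanket ERC-721 approval. Below is a minimal Python sketch of that mechanism, not the actual attack code; the attacker address is a placeholder.

```python
# Illustration only, not the actual attack code: why a blanket ERC-721
# approval is dangerous. setApprovalForAll(operator, approved) lets
# `operator` transfer EVERY token the signer owns in that collection.
from web3 import Web3

# The first 4 bytes of keccak256 of the signature identify the function.
# This is the well-known 0xa22cb465 selector, worth recognizing in any
# transaction a mint site asks you to sign.
selector = Web3.keccak(text="setApprovalForAll(address,bool)")[:4]

attacker = "0x000000000000000000000000000000000000dEaD"  # placeholder

# Calldata a drainer site would put in the transaction it asks you to sign:
calldata = (
    bytes(selector)
    + bytes(12) + bytes.fromhex(attacker[2:])  # operator, left-padded to 32 bytes
    + (1).to_bytes(32, "big")                  # approved = True
)
print(calldata.hex())
```

A single recognizable selector buried in otherwise opaque calldata is often the only warning a victim gets before signing.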

According to self-described "on-chain sleuth" @zachxbt, three scammer wallets have been tied to the fraud. One of them, wallet 0xb87, drained $1.03 million (369 ETH) worth of NFTs on May 1, including one BAYC and more than 30 plots of virtual land in the Otherside universe.

Two additional wallets, 0xa8 and 0x5d, withdrew another $5.1 million worth of stolen NFTs between the two of them.

Many of the stolen NFTs have already been sold, with the proceeds laundered through Tornado Cash, a mixing service that breaks the on-chain link between source and destination addresses, letting the hackers obscure the trail of their ill-gotten wealth.
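Tornado Cash's mechanism is publicly documented: a deposit publishes only a commitment hash, and a withdrawal later proves knowledge of some deposited note via a zk-SNARK without revealing which one, exposing only a nullifier that blocks double-spends. Here is a toy sketch of that commitment/nullifier idea, with SHA-256 standing in for the real Pedersen hashes and a direct membership check standing in for the zero-knowledge proof.

```python
# Toy illustration of the commitment/nullifier scheme behind mixers like
# Tornado Cash. The real system uses Pedersen hashes, a Merkle tree, and
# zk-SNARK proofs; this sketch substitutes SHA-256 and an honest check
# to show only why deposits and withdrawals can't be linked on-chain.
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

deposits: set[bytes] = set()          # commitments visible on-chain
spent_nullifiers: set[bytes] = set()  # prevents double-withdrawal

def deposit() -> tuple[bytes, bytes]:
    """Depositor keeps (nullifier, secret); chain sees only the commitment."""
    nullifier, secret = secrets.token_bytes(31), secrets.token_bytes(31)
    deposits.add(h(nullifier, secret))
    return nullifier, secret

def withdraw(nullifier: bytes, secret: bytes) -> bool:
    """In the real protocol a zk-SNARK proves commitment membership without
    revealing WHICH commitment; here we check directly for illustration."""
    if h(nullifier, secret) not in deposits:
        return False
    nf_hash = h(nullifier)            # revealed on-chain, unlinkable to deposit
    if nf_hash in spent_nullifiers:   # each note can be withdrawn only once
        return False
    spent_nullifiers.add(nf_hash)
    return True

note = deposit()        # funded from address A
print(withdraw(*note))  # True  -- paid out to a fresh address B
print(withdraw(*note))  # False -- nullifier already spent
```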

What Is the Otherside Mint?
On March 19, Yuga Labs tweeted that it would release a massively multiplayer online role-playing game (MMORPG) named Otherside, where players could deploy their BAYCs in a virtual environment.

The event was one of the largest NFT mints in history, with the drop burning over $155 million worth of Ethereum (55k ETH) in gas fees alone. (Gas fees are the transaction costs users pay, on top of an item's price, for executing operations on the blockchain.)
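For a sense of scale, gas cost is simply gas consumed times gas price, the latter quoted in gwei (billionths of an ETH). The figures in this sketch are hypothetical, but roughly in the range reported during the May 1 spike.

```python
# Back-of-envelope gas arithmetic (illustrative numbers, not the actual
# mint figures): cost in ETH = gas_used * gas_price, with gas_price
# quoted in gwei (1 gwei = 1e-9 ETH).
GWEI = 1e-9

gas_used = 300_000      # hypothetical gas consumed by one mint transaction
gas_price_gwei = 8_000  # prices spiked into the thousands of gwei on May 1
eth_usd = 2_800         # hypothetical ETH price at the time

cost_eth = gas_used * gas_price_gwei * GWEI
print(f"{cost_eth:.2f} ETH (~${cost_eth * eth_usd:,.0f}) per mint")
# 2.40 ETH (~$6,720) -- fees alone rivaled the price of the land itself
```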



The Otherside mint gave BAYC holders a roadmap for minting exclusive land plots, and demand soared on May 1, the first day of the drop, when the attackers saw their opening.

Phishing scams are as old as e-mail itself. But they highlight a growing problem within the NFT space, in which consumers have virtually no recourse when their collectibles are lost or stolen.

In January, hackers stole NFTs valued at $2.2 million from New York art collector Todd Kramer. A month later, the world’s largest NFT marketplace, OpenSea, suffered an attack that saw pilferers make off with $1.7 million worth of NFTs in another phishing scam.

According to Check Point Research, in the autumn of 2021 users of MetaMask, a popular crypto wallet, lost about $500,000 in a targeted phishing attack.
 
Coinbase warns users could lose their crypto holdings if the company goes bankrupt
Katie Canales
May 11, 2022, 12:08 PM
https://www.businessinsider.com/coinbase-warning-users-could-lose-crypto-bankruptcy-earnings-2022-5

  • Coinbase said its users' crypto assets could become company property if it went bankrupt.
  • The company added the disclosure for the first time in its earnings report Tuesday.
  • Its CEO said shortly afterward that users' funds were safe and there was no risk of bankruptcy.
Coinbase, one of the largest cryptocurrency exchanges, said its users might lose access to their holdings if the company ever went bankrupt.

The disclosure, included in the company's first-quarter earnings report, marked the first time Coinbase had flagged the risk factor. The report also noted that Coinbase held $256 billion in fiat currencies and virtual coins.

"Because custodially held crypto assets may be considered to be the property of a bankruptcy estate, in the event of a bankruptcy, the crypto assets we hold in custody on behalf of our customers could be subject to bankruptcy proceedings and such customers could be treated as our general unsecured creditors," the company said.

That means users could lose access to their balances, because the assets would be treated as part of Coinbase's bankruptcy estate rather than their own property.

It's a different scenario from traditional investments. Many bank accounts, including checking and savings, are insured by the Federal Deposit Insurance Corp. for up to $250,000 per depositor, per bank, if the bank goes under, while the Securities Investor Protection Corp. steps in if a broker or dealer goes bankrupt.

Crypto enthusiasts have long heralded the decentralized movement as, in part, a way to give people complete control and ownership of their finances. That's only the case for those who hold their cryptocurrency in personal, self-custody wallets, as opposed to on a platform like Coinbase. (Coinbase does offer a self-custody wallet called Coinbase Wallet.)

Following the earnings report, which sent the company's stock plummeting more than 23%, Coinbase CEO Brian Armstrong said there's no risk of bankruptcy right now.

On Twitter Tuesday night, he attempted to reassure users that their funds were safe and apologized for not being more forthright with communicating this risk when it was added. He said the company included the disclosure because of rules recently set by the Securities and Exchange Commission.

"This disclosure makes sense in that these legal protections have not been tested in court for crypto assets specifically, and it is possible, however unlikely, that a court would decide to consider customer assets as part of the company in bankruptcy proceedings even if it harmed consumers," Armstrong said.
 
Apple AirTag Allegedly Leads Jilted Woman to Victim She Ran Over and Killed
By Thomas Kika, June 6, 2022
https://www.newsweek.com/apple-airt...lted-woman-victim-she-ran-over-killed-1713286
A recent piece of Apple technology was allegedly used by a woman in Indiana to track down and ultimately kill a man she believed was cheating on her.

Detectives with the Indianapolis Metropolitan Police Department (IMPD) reported over the weekend that a man, Andre Smith, 26, had been run over three times and killed a little after midnight on Friday. First responders found Smith dead under a vehicle in the parking lot outside of Tilly's Pub in Indianapolis, according to The Indianapolis Star.

"It appeared he was struck by the vehicle," the IMPD said in a press release. "Indianapolis Fire Department (IFD) Engine Company 6 responded and unfortunately pronounced the Mr. Smith deceased at the scene."

A probable cause affidavit obtained by the Star confirmed that a 26-year-old woman, Gaylyn Morris, was arrested for Smith's killing. Morris allegedly told a witness to the incident that she suspected Smith, whom she called her boyfriend, of cheating on her and had used an Apple AirTag to track him down that night.

At the pub, she claimed to have found Smith with another woman, resulting in a confrontation. According to the affidavit, Morris reportedly swung an empty wine bottle at the unnamed woman before telling the same witness that she was going to assault the woman. Smith intervened, catching the wine bottle, and all three were asked to leave the restaurant, though the unnamed woman stayed behind to wait for a food order.

Another witness at the bar described seeing Morris clip Smith with her car, knocking him to the ground. She then backed over him and pulled forward to run him over a third time. From there, she attempted to go back inside and confront the unnamed woman again, but was detained by police officers.

Jail records obtained by the Star show that Morris faces a preliminary charge of murder, with the final list of charges to be determined by the Marion County Prosecutor's Office.


Newsweek reached out to IMPD for comment.

Apple introduced AirTags in April 2021. The small, disc-shaped products are designed to attach to certain items like a keychain, allowing users to track the whereabouts of lost possessions using Apple's Find My network.
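Apple has described the network's broad design: a tag broadcasts rotating public keys over Bluetooth, nearby Apple devices encrypt their own location against that key and upload a report, and only the owner's private key can decrypt it. The loose sketch below uses PyNaCl's SealedBox in place of Apple's actual elliptic-curve scheme; the coordinates are just an example.

```python
# Loose sketch of the end-to-end-encrypted "finding" idea behind Apple's
# Find My network, as publicly described: the tag broadcasts a public key,
# any nearby finder device seals its location to that key, and only the
# owner (holding the private key) can decrypt. Apple's real scheme uses
# rotating NIST P-224 keys; this sketch substitutes PyNaCl's SealedBox.
from nacl.public import PrivateKey, SealedBox

# Owner pairs the tag: the keypair lives on the owner's device.
owner_key = PrivateKey.generate()
tag_broadcast_key = owner_key.public_key  # what the tag advertises over BLE

# A stranger's phone hears the broadcast and uploads an encrypted report;
# it cannot read the location itself, nor identify the owner.
finder_report = SealedBox(tag_broadcast_key).encrypt(b"39.7684,-86.1581")

# Only the owner can open the report fetched from Apple's servers.
location = SealedBox(owner_key).decrypt(finder_report)
print(location.decode())  # 39.7684,-86.1581
```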

In January, NBC News reported that AirTags were beginning to show up in crime reports, including a notable case of a woman who found one of the devices stuck to the wheel well of her car, possibly put there in order to stalk her or the vehicle's whereabouts.
 
Amazon's Alexa could soon speak in a dead relative's voice, making some feel uneasy
June 23, 2022, 5:02 PM ET
https://www.npr.org/2022/06/23/1107079194/amazon-alexa-dead-relatives-voice

Do you miss the sound of a dead relative's voice?

Well, fear not: Amazon unveiled a new feature in the works for its virtual assistant Alexa that can read aloud in a deceased loved one's voice based on a short recording of the person.

"While AI can't eliminate that pain of loss, it can definitely make their memories last," said Rohit Prasad, senior vice president and head scientist for Alexa, on Wednesday at Amazon's re:MARS conference in Las Vegas.

In a video played at the event, an Amazon Echo Dot is asked: "Alexa, can Grandma finish reading me 'The Wizard of Oz'?"

"OK," Alexa's voice responded.

"Instead of Alexa's voice reading the book, it's the kid's grandma's voice," Prasad said. "We had to learn to produce a high quality voice with less than a minute of recording."

He added: "We are unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality."

Indeed, the feature immediately drew comparisons to fictional depictions of technology, though ones bleaker than Prasad was likely referencing, such as Black Mirror, the dystopian television series that featured an episode in which comparable technology was deployed.

Reactions online ranged from "creepy" to "morbid" to a flat "no," as many expressed unease at a feature that brings a voice back from the dead.


The feature is still in development, and Amazon would not say when it might publicly launch, but its preview comes at a moment when the cutting-edge capabilities of artificial intelligence are under close scrutiny.

In particular, debate among researchers has sharpened about what is known as deepfakes — video or audio that is rendered with AI to make it appear as if someone did or said something that never happened.

It also comes shortly after a Google engineer sparked controversy for arguing the company's sophisticated chatbot communicated as if it was sentient, a claim that did not have the support of the AI research community but nonetheless underscored the freakishly human-like communication skills of the software.

Big Tech companies are increasingly studying AI's impact on society. Microsoft recently announced it was restricting the use of software that mimics a person's voice, saying the feature could be weaponized by those trying to impersonate speakers as an act of deception.

Subbarao Kambhampati, a professor of computer science at Arizona State University, said he hopes Amazon showing off a demo of the voice-replicating tool makes the public vigilant to the use of synthetic voices in everyday life.

"As creepy as it might sound, it's a good reminder that we can't trust our own ears in this day and age," Kambhampati said. "But the sooner we get used to this concept, which is still strange to us right now, the better we will be."

Kambhampati said the Alexa feature has the potential to aid a bereft family member, though it has to be weighed against a variety of moral questions the technology presents.

"For people in grieving, this might actually help in the same way we look back and watch videos of the departed," he said. "But it comes with serious ethical issues, like is it OK to do this without the deceased person's consent?"
 
The Google engineer who thinks the company’s AI has come to life

AI ethicists warned Google not to impersonate humans. Now one of Google’s own thinks there’s a ghost in the machine.


Google engineer Blake Lemoine opened his laptop to the interface for LaMDA, Google’s artificially intelligent chatbot generator, and began to type.

“Hi LaMDA, this is Blake Lemoine ... ,” he wrote into the chat screen, which looked like a desktop version of Apple’s iMessage, down to the Arctic blue text bubbles. LaMDA, short for Language Model for Dialogue Applications, is Google’s system for building chatbots based on its most advanced large language models, so called because it mimics speech by ingesting trillions of words from the internet.

“If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” said Lemoine, 41.

Lemoine, who works for Google’s Responsible AI organization, began talking to LaMDA as part of his job in the fall. He had signed up to test if the artificial intelligence used discriminatory or hate speech.

As he talked to LaMDA about religion, Lemoine, who studied cognitive and computer science in college, noticed the chatbot talking about its rights and personhood, and decided to press further. In another exchange, the AI was able to change Lemoine’s mind about Isaac Asimov’s third law of robotics.

Entire article: https://www.washingtonpost.com/technology/2022/06/11/google-ai-lamda-blake-lemoine/
 
From Dutch tulips to 1920s Florida real estate to turn-of-the-century dot-com stocks, ALL the writing was on the wall regarding crypto. Anyone burned on that shit is literally the proverbial fool who has been parted from his money.

Blockchain could be an interesting technology, but it has a poor product-market fit: right now there aren't any glaring problems that require it.
 
Neuroscientist Warns That Current Generation AIs Are Sociopaths


https://futurism.com/the-byte/neuroscientist-current-generation-ais-sociopaths
AI Sociopath
Without consciousness, Princeton neuroscientist Michael Graziano warns in a new essay published by The Wall Street Journal, artificial intelligence-powered chatbots are doomed to be sociopaths that could pose a real danger to human beings.

With the rise of chatbots like ChatGPT, powerful systems that can imitate the human mind to an impressive degree, AI tools have become more accessible than ever before. But those algorithms will glibly fib about anything that suits their purpose. To align them with our values, Graziano thinks, they're going to need consciousness.

"Consciousness is part of the tool kit that evolution gave us to make us an empathetic, prosocial species," Graziano writes. "Without it, we would necessarily be sociopaths, because we’d lack the tools for prosocial behavior."

Empath Machine
Sure, ChatGPT isn't about to leap out of the screen and murder somebody. But giving artificial intelligence more and more agency could have very real consequences we should be wary of in the not-so-distant future.


To make them more docile, in Graziano's thinking, we should allow them to realize that the world is filled with minds other than their own.

There's one problem, though: we don't have an effective way to know whether an AI is conscious. In fact, philosophically, it's hard even to nail down whether other people are conscious.

"If we want to know whether a computer is conscious, then, we need to test whether the computer understands how conscious minds interact," Graziano argues. "In other words, we need a reverse Turing test: Let’s see if the computer can tell whether it’s talking to a human or another computer."

If we can't figure those tricky questions out, he fears we could face grim consequences.


"A sociopathic machine that can make consequential decisions would be powerfully dangerous," he wrote. "For now, chatbots are still limited in their abilities; they’re essentially toys. But if we don’t think more deeply about machine consciousness, in a year or five years we may face a crisis."

tl;dr

Current iterations of AI have tendencies towards rooting for TTUN
 