
A Food Bank In Washington Fell Victim of a Ransomware Attack

Ransomware attacks have mostly targeted for-profit organizations, but this time a nonprofit food bank fell victim. The Auburn Food Bank, located in Washington, was hit by a ransomware strain known as GlobeImposter 2.0, which encrypted all the computers running on the Food Bank’s network. There is no clue as to how the attackers got the ransomware onto the network. Only one system was left unencrypted, and it will continue to be used for charity purposes.

Auburn Food Bank is a nonprofit organization that offers free food in public places and runs entirely on charity. The Food Bank has refused to pay the attackers’ ransom demand because it is very high and there is no guarantee that the crooks will decrypt the files after receiving payment. The attackers demanded 1.2 bitcoin, roughly equal to $9,500.

To show they held the decryption keys, the crooks decrypted one file, then tripled their ransom demand. Rather than pay the unacceptable ransom, the food bank has chosen to replace its network and systems. Tools like ZoneAlarm Anti-Ransomware have emerged as some of the best defenses against such crooks, and the organization plans to deploy anti-ransomware tools on its new network and systems.

Auburn Food Bank Director Debbie Christian said the organization will not pay the ransom and has instead decided to scrap all of its systems. Even the network server and mail server will be replaced with new ones, and the plan is to rebuild with a new network and new files.

Jenny is one of the oldest contributors to Bigtime Daily, bringing a unique perspective on world events. She aims to empower readers by delivering apt, factual analysis of news pieces from around the world.


Why Accidents Involving Self-Driving Cars Are So Complex


The last two decades have seen tremendous technological advancement and innovation. Technologies like video calling and driverless cars, once possible only in sci-fi movies, are now a reality.

Unlike some other technology faults, driverless car errors can be a matter of life and death. While there is no doubt that driverless cars are the future of driving, a lot still needs to be done before the technology can be considered safe.

They May Not Be As Safe

In the past few years, there have been several stories about vehicles on autopilot causing accidents. Some of these situations would have been easily avoidable for a human driver, calling into question the safety of autonomous features. While accidents involving cars on autopilot usually result in less severe injuries than those involving driver-operated vehicles, a recent study shows that their accident rate is notably higher.

On average, there are 4.1 crashes per 1 million miles traveled for driver-operated vehicles compared to 9.1 per 1 million miles traveled for vehicles with autonomous driving features.

Misleading Terminologies

Currently, there is little regulation of autonomous driving claims. Many autonomous car makers capitalize on loopholes in the law to market their vehicles with misleading terminology about their capabilities, which makes determining liability a complex issue.

For example, Tesla calls its advanced driver-assist feature Autopilot, a name drivers can interpret as meaning the car is entirely autonomous. On its website, Tesla states that Autopilot is an advanced driver-assist feature meant to complement attentive human drivers, not replace them. Unfortunately, many drivers of semi-autonomous cars get a false sense of security from the misleading terminology, resulting in devastating accidents.

Accidents that happen under such circumstances can leave Tesla liable. Recently, a court in Germany found the “autopilot” label on Tesla vehicles misleading, which means Tesla could be liable for damages resulting from reliance on the feature.

Technology Malfunction

Autonomous car makers could also be liable for an accident caused by a malfunction in their system. Malfunctions can result from system failure or even cyber-attacks.

In 2015, a planned hacking test was conducted on a Jeep. Surprisingly, the hackers were able to access the Jeep remotely and stop it while it was traveling at 70 mph. Accidents that result from system hacking could leave car manufacturers liable, because system hacks are outside the driver’s control.

Driver Liability

In January 2022, a 27-year-old Tesla driver was charged with vehicular manslaughter for hitting and killing two occupants of a Honda Civic at an intersection while on Autopilot. The case marked the first time an American faced criminal charges for an autopilot-related accident, and it could set a precedent for future accidents involving autopilot features.

“Autopilot cannot and should not replace attentive driving,” says car accident attorney Amy Gaiennie. “All drivers should keep their attention on the road and only use any self-driving assistive technology to complement their safe driving practices.”

According to the NHTSA, control of the vehicle lies with the driver, no matter how sophisticated its technology. This means that when an accident results from a driver failing to play their part in operating the vehicle, the motorist can carry liability for the accident.

As it stands, vehicles cannot be considered entirely autonomous, though the technology is headed there fast. Until then, the driver must play a significant role in operating the vehicle; failure to do so could leave them liable for damages.
