Tuesday, November 30, 2021

Fisher Price Chatter Telephone with Bluetooth

Introducing the special edition Fisher-Price® Chatter Telephone™ — a phone smart enough not to come with any apps. Its intuitive bulky face design comes with a 'super-advanced' rotary dial and connects to your mobile device via Bluetooth® wireless technology, so you can make and receive real calls through your existing phone plan. This working Chatter Telephone™ is so mobile, it even comes with wheels. Plus, it has grownup functionality like speakerphone and the ability to dial out. Your childhood is calling, now you can actually answer. Available exclusively at Best Buy® while supplies last.



from Hacker News https://ift.tt/3mIIcLp

Advent of Code 2021

--- Day 1: Sonar Sweep ---

You're minding your own business on a ship at sea when the overboard alarm goes off! You rush to see if you can help. Apparently, one of the Elves tripped and accidentally sent the sleigh keys flying into the ocean!

Before you know it, you're inside a submarine the Elves keep ready for situations like this. It's covered in Christmas lights (because of course it is), and it even has an experimental antenna that should be able to track the keys if you can boost its signal strength high enough; there's a little meter that indicates the antenna's signal strength by displaying 0-50 stars.

Your instincts tell you that in order to save Christmas, you'll need to get all fifty stars by December 25th.

Collect stars by solving puzzles. Two puzzles will be made available on each day in the Advent calendar; the second puzzle is unlocked when you complete the first. Each puzzle grants one star. Good luck!

As the submarine drops below the surface of the ocean, it automatically performs a sonar sweep of the nearby sea floor. On a small screen, the sonar sweep report (your puzzle input) appears: each line is a measurement of the sea floor depth as the sweep looks further and further away from the submarine.

For example, suppose you had the following report:

199
200
208
210
200
207
240
269
260
263

This report indicates that, scanning outward from the submarine, the sonar sweep found depths of 199, 200, 208, 210, and so on.

The first order of business is to figure out how quickly the depth increases, just so you know what you're dealing with - you never know if the keys will get carried into deeper water by an ocean current or a fish or something.

To do this, count the number of times a depth measurement increases from the previous measurement. (There is no measurement before the first measurement.) In the example above, the changes are as follows:

199 (N/A - no previous measurement)
200 (increased)
208 (increased)
210 (increased)
200 (decreased)
207 (increased)
240 (increased)
269 (increased)
260 (decreased)
263 (increased)

In this example, there are 7 measurements that are larger than the previous measurement.

How many measurements are larger than the previous measurement?
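
As a quick sanity check on the worked example, here is a minimal Python sketch (not part of the puzzle text); the function name count_increases is just illustrative.

def count_increases(depths):
    """Count measurements that are larger than the immediately preceding one."""
    return sum(1 for prev, cur in zip(depths, depths[1:]) if cur > prev)

example = [199, 200, 208, 210, 200, 207, 240, 269, 260, 263]
print(count_increases(example))  # prints 7, matching the worked example above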



from Hacker News https://ift.tt/3EdfbxR

What I learned about interviewing

This was my first rodeo through the ride that is tech interviews. I had escaped the necessity of formal technical interviews prior to this point. Oh buddy, did I end up with scabby knees. It was a steep hill to climb, especially since my previous job didn’t require any of the skills they were interviewing on (pair programming with strangers, system design, etc.). It’s been said before, but interviewing is a skill in and of itself; being bad at technical interviews doesn’t correlate strongly with whether you’re a good programmer. This leads me to the first thing I learned about interviewing.

  1. The company shouldn’t only be looking at whether you can deliver a correct solution.

By this I mean it should be obvious that they’re also screening for how pleasant a person you are to work with on technical problems. If they’re not, well, it’s probably not a place you really want to work at (your coworkers aren’t guaranteed to be – let’s say fun – to work with). And this makes the technical interview so much less daunting, imo, because you don’t have to get everything 100 percent correct to pass; you have to demonstrate active listening skills (meaning you take hints from the interviewer while iterating on your solution) and be able to verbally communicate your thought process while working through a solution.

In the end, though, you will have to learn what they’re looking for in a candidate, and be prepared for the format of the interview. It’s extremely difficult to practice for this because the live nature of these interviews is what makes it hard.

  2. Always cringe forward, not backward.

If you can, schedule your interviews in order of how much you want each job, saving the most important for last. This isn’t to say that you should in any way skimp on preparing for those earlier interviews; you might realize too late that you misjudged the importance you’d assigned to each potential job. But you should be strategic so that you can approach each mistake as an opportunity for learning.

Despite my preparations, I fell on my face during my first systems design interview. Granted, I had had only a few days to study, but I had no feel for how the flow of the interview was supposed to go. One could argue that if you know how to design a system, you’ll do fine. To that I call bull, because the reason that the company is using this type of interview in the first place is that you’re supposed to be fitting a mold. For example, if an airplane started loading coach first, people would notice and be unnerved. It’s not necessarily wrong, it’s that it’s different than what they’re expecting and they will find a reason to hate it because of this. Humans are habitual creatures. There’s no other sane reason that they would use this sort of interview to screen for a position (mid-level) that will most assuredly not be choosing whether to use a SQL or NoSQL database for a service at the company other than “does this person fit a mold”. This isn’t to say that they are useless at determining whether a candidate knows how to handle tradeoffs, but at a certain level, asking someone to design a system in an hour that would typically take weeks requires a certain amount of playing to a script. /rant

So I took that botched system design interview, and I looked at what was asked, where I went wrong, and used it to inform my studying for the next interview. In my next system design interview, I was actually surprised to find I was able to design the system satisfactorily. Practice!

  3. When you falter during an interview, pay attention to how your interviewers respond.

You may work with these people. You will likely make mistakes. How do they react when you make a mistake? If the interview feels bad for reasons other than you just did poorly, it might be time to consider whether you want the job at all.

  4. Communicate with your company contact if you need to expedite the interviewing process.

I had never applied to multiple jobs at once like this before. It was an entirely new concept to me that syncing interview timelines is a delicate balance, so that no company is left waiting more than a few days after issuing an offer.

  5. Do not put all hope in one offer.

A good way to lose is to put too much mental emphasis on one particular company. It also psyches you out and makes you much more likely to be nervous during your interview. It’s fine to know which job you want, but there are so many unknowns for each position that you really won’t know if you’ve made the right choice until after your first day of work.

  6. Utilize twitter.

I have my current job because of a tweet, and because some great people decided to retweet it and it reached the CEO of the company. Even if you don’t feel like you have a presence on twitter, it’s much more effective at reaching the right audiences than sending out cold-call applications.



from Hacker News https://ift.tt/3pvWnnt

Which brand/fuel/color/plate of car has the most “psychopath drivers”?

Lots of generalisations can be made about you as a driver, with people stereotyping your driving style based on your age, gender and job role. The type of car you drive can cause others on the road to judge you too, with certain brands or even colours bringing a reputation with them.

The phrase “driving like a psycho” might refer to someone driving dangerously, but are some drivers more psychopathic than others? And do people really share characteristics with others who drive the same car brand as them? Are some of these reputations valid?

We wanted to find out, so we tasked 2,000 drivers with taking a psychopathy test – the scores of which indicate how likely it is that you’d show traits of psychopathy (including superficial charm, grandiose sense of self-worth, and a lack of remorse or guilt).

From these results, we were then able to calculate an average psychopathy score for different groups of drivers, categorised by the brand, colour and customisation of the car you drive. Depending on what you drive, the results might shock you!

Which Drivers Have The Most Psychopathic Tendencies?

The drivers with the highest likelihood of being a psychopath were BMW drivers, with an average psychopathy score of 12.1 out of 36. Interesting, as BMW drivers already have a negative reputation on the roads, with a quick Google search revealing that the term ‘Why are BMW drivers…’ is followed by results including ‘so arrogant’, ‘idiots’ and ‘so hated’.

Our study revealed that the average score on the psychopath test across all drivers was 6.6, so BMW drivers are likely to show significantly more psychopathic traits than the average driver, with scores of almost double the average.

BMW drivers were followed closely behind in second place by Audi drivers, a fellow German automotive manufacturer. Audi drivers averaged a psychopathy score of 11.7, meaning they are more likely to adopt traits including narcissism and pathological lying than others.

Fiat (7.0), Mazda (6.4) and Honda (6.3) drivers rounded out the top five most psychopathic drivers, all with average scores of over five on the test.

When it came to the drivers with the least psychopathic tendencies, Skoda drivers scored well below the average. They had an average score of just 3.2 out of 36, suggesting those driving a Skoda are unlikely to show signs of classic psychopathic traits, such as poor judgement or impulsivity, potentially making them calmer drivers on the roads.

Rank Car brand Average psychopathy score (/36)
1 BMW 12.1
2 Audi 11.7
3 Fiat 7.0
4 Mazda 6.4
5 Honda 6.3
6 Ford 6.1
7 Mercedes-Benz 5.9
8 Citroen 5.8
9 Volkswagen 5.4
10 Hyundai 5.3
11 Renault 5.3
12 Other 5.3
13 Volvo 5.2
14 Nissan 5.0
15 Peugeot 4.8
16 Toyota 4.7
17 Vauxhall 4.7
18 SEAT 4.3
19 Kia 4.2
20 Skoda 3.2

Which Fuel Types Correlate With High Psychopathy Scores?

With electric cars becoming more and more common, and the UK aiming to ban the sale of petrol and diesel cars by 2030, we also wanted to see if your fuel type of choice correlated with how much of a psychopath you could be.

Interestingly, those who drive an electric vehicle may be doing their part to help the environment, but they scored 16.0 out of 36 on average on the psychopathy test – the highest score of all the groups of drivers we surveyed, and edging closer to what the test defines as having ‘possible psychopathy’.

Hybrid cars followed in second position scoring almost 10 out of 36 (still well over the average score) and petrol drivers scored just 5.2 out of 36 on average, suggesting those currently sticking to petrol are unlikely to show traits of being a psychopath. 

Fuel type Average psychopathy score (/36)
Electric 16.0
Hybrid 9.8
Diesel 7.0
Petrol 5.2

Does Your Car Colour Define How Psychopathic You Could Be?

In addition to car brand and fuel type, there’s another obvious difference in the cars we drive: their colour. Could your choice of colour define how many (or how few) psychopathic traits come through in your personality? We found that out too!

While gold cars are definitely not the most common colour of car you see driving down the motorway, it was drivers of this flash colour that had the highest average psychopathy scores in our study, scoring an average of 12.7 out of 36. Those who drive brown coloured cars followed closely behind scoring just over 12 on the psychopathy scale, and drivers of green coloured cars completed the top three most psychopathic drivers, according to car colour.

Rank Car colour Average psychopathy score (/36)
1 Gold 12.7
2 Brown 12.2
3 Green 8.5
4 Black 7.6
5 Blue 7.0
6 White 6.0
7 Silver 5.5
8 Red 4.9

Personalised Number Plate Owners Are Almost Three Times More Likely To Show Psychopathic Tendencies

Finally, we investigated whether having a personalised number plate could affect how high you scored on the psychopath test. In December 2020, the DVLA reported that nearly 370,000 personalised registrations had been purchased using their online service.

Our research found that those who own a personalised number plate are almost three times more likely to display psychopathic traits than those who own a standard number plate. Those with personalised number plates scored 13.8 out of 36, on average, while those with standard plates had a psychopathy score of only 5.3.

Number plate Average psychopathy score (/36)
Personalised 13.8
Standard 5.3

Ultimately, no matter what car you drive, or your score on the psychopathy test, ensure you’re staying safe while driving so that your car doesn’t have to prematurely hit the scrap heap.

If our findings have left you wondering where you would score on the psychopath test, then you can take it for yourself here:

While the findings from our study are interesting, and none of the drivers surveyed scored highly enough to suggest they possess the clear traits generally exhibited by a psychopath, psychopathy is a condition that affects lives and should therefore be taken seriously.

If you think you possess traits of psychopathy, or are worried about any of the themes or content discussed in this study, you can find more information and support at https://www.mind.org.uk/.

Methodology

We partnered with 3Gem to survey 2,000 UK drivers and analyse the brand and colour of the car they currently drive, in addition to whether they owned a personalised number plate or not.

Each participant then undertook a twelve-question psychopathy test inspired by Psych Central, giving each driver a psychopathy score out of 36 to reveal how many psychopathic tendencies they may have. The scoring system is detailed below:

  • 0-18 – no psychopathy
  • 19-26 – psychopathy possible
  • 27+ – psychopathy likely

Average psychopathy scores were then calculated for car brands, car colours and types of number plate, to reveal which factors were most associated with psychopathic tendencies.

Survey conducted in November 2021.
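
For readers curious how the banded scoring and group averages described above might be computed, here is a rough Python sketch. It is not the study's actual code, and the driver responses in it are invented purely for illustration; only the twelve-question, 0-36 scale and the bands listed above come from the methodology.

def band(score):
    """Map a 0-36 total to the bands listed above."""
    if score <= 18:
        return "no psychopathy"
    if score <= 26:
        return "psychopathy possible"
    return "psychopathy likely"

# Hypothetical respondents: twelve answers each, scored 0-3 per question.
drivers = [
    {"brand": "BMW", "answers": [3, 2, 1, 0, 3, 2, 1, 0, 3, 2, 1, 0]},
    {"brand": "Skoda", "answers": [0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]},
]

scores_by_brand = {}
for d in drivers:
    total = sum(d["answers"])  # total out of 36
    scores_by_brand.setdefault(d["brand"], []).append(total)
    print(d["brand"], total, band(total))

# Average psychopathy score per car brand, as in the tables above.
for brand, scores in scores_by_brand.items():
    print(brand, round(sum(scores) / len(scores), 1))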



from Hacker News https://ift.tt/3pfE4CF

Lazydocker: The lazier way to manage everything Docker



from Hacker News https://ift.tt/31ZokY1

Pglet – Web UI framework for back end developers



from Hacker News https://ift.tt/3c32WbI

Father-son duo helped techies ‘hack exams’, earn top scores for big payday

(L to R) Rajesh Kumar Shah, Deep Shah, and Aklakh Alam | Photo by special arrangement

New Delhi: The Intelligence Fusion and Strategic Operations (IFSO) unit of the Delhi Police has busted a “module” that has allegedly been taking online IT certification exams on behalf of students and professionals aiming to boost their career prospects in IT companies. So far, the police have arrested three people in connection with the money-for-marks scheme.

According to the police, the masterminds of the high-tech cheating racket are a father-and-son duo, Rajesh Kumar Shah and Deep Shah, who run an IT coaching institute in Ahmedabad, Gujarat. The two allegedly hired a Delhi-based technical expert, Aklakh Alam, to take the exams remotely for clients.

“We received intel that several services are available on the dark web, in which hackers claim they can get the desired score by hacking into the device used by the examinee,” Deputy Commissioner of Police (DCP) K.P.S. Malhotra told ThePrint. Another police source said that the accused charged around Rs 9,000-10,000, and gave exams for about 200 clients.

The trio apparently specialised in cracking various online tech certification exams. Getting a high score in these competitive exams can help IT aspirants get better placements, DCP Malhotra told ThePrint.

“Various international certifications are prerequisites to upgrade technical skills. These certifications are being provided by a number of reputed organisations — there are certifications from Cisco, CompTIA, EC-Council… these play a crucial role in the selection and pay grade of a candidate in the IT sector as well in other industries,” the officer said. He added that high scores in these competitive exams can make a big difference to the career progress of IT aspirants.

“These certifications are taken up worldwide, by huge IT companies like Microsoft, Google etc and higher packages are given to the aspirants,” another police source said.

“They have been running this scam since the Covid-19 outbreak, as all examinations shifted to an online mode. The latest intel we received was [about the] Pearson IT certification,” the source added.


The crackdown

Based on intelligence gathered, a Delhi Police team arranged for a decoy to pose as an aspirant who was willing to pay to get high scores in the CompTIA A+ Certification (Core 1) examination.

According to the police, the decoy contacted the hackers using Voice over Internet Protocol (VoIP) communication and then transferred the fee to the account number specified to him. After this, the hacker asked the candidate to download software called Iperius Remote.

“Through the software, [the hacker] gained control of the participant’s laptop and attempted the exam on 25 October. The decoy candidate passed the exam with a score of 736. Accordingly, a case was registered,” DCP Malhotra said.

The police first traced Deep Shah based on a technical analysis of the mobile number, bank account and internet IP address.

The modus operandi

According to the police, Deep and his father Rajesh gave potential clients a “100 per cent guarantee” of passing online certification exams. “Through their training centre they approached applicants who didn’t have the required knowledge and skills and promised them the desired score. They also contacted candidates through WhatsApp and Telegram,” the DCP said. He added that for actually attempting these exams, the father-son pair hired Alam who “hacked sites for various exams — AWS (Amazon Web Services), Azure, CompTIA A+, PMP, CISM, CEH (Cyber Ethical Hacking), etc. by getting remote access through apps”.

Alam, police said, holds top-level IT certifications in networking and has over 12 years of experience working as an A-grade network implementation and design engineer.

The police say that the hackers’ first step was to ask the candidates to download remote-access software like Ultraviewer, Anydesk, or Iperius Remote. The next step was to install software on the client’s system that would escape detection by the exam-conducting company’s security software. They also used software that would make it difficult for examiners to detect anomalies in the candidate’s movements, such as pupil movements. Once all this was in place, IT expert Alam would attempt the exam.

“These competitive examinations are recognised worldwide. The exams which were earlier organised offline, are now being conducted online through various software. For instance, Cisco offers certifications for beginners, associates, experts in technology. All of these exams require knowledge and a particular skillset, such as programming, to score,” the police source said.



from Hacker News https://ift.tt/3Eao6zX

Lessons from Seven Years of Remote Work

The inspiration for this post is Željko Filipin’s post on the same topic.

Nobody worked remotely during the pandemic, but everybody worked from home.

During the pandemic, office workers had to adjust to working out of their homes. But remote work is different: you’re not necessarily working from home; you’re working while separated from a lively, in-person office. You might be in the same city as your co-workers or on the other side of the world.

When you’re physically disconnected from your colleagues, you have to build new skills and adapt your tactics. This is advice I wish I’d had seven years ago when I started working remotely.

Asynchronous communication

Office workers have the luxury of hallway conversations. In an in-person office, getting feedback takes mere minutes. But in a remote position, where you may be on the other side of the planet from your colleagues, getting an answer may take overnight.

To be effective, you need to master asynchronous communication.

Timezones suck

I wish this section was as simple as saying: use UTC for everything, but it’s never that easy. You should definitely give meeting times to people in UTC, but you should tie meetings to a local timezone. The alternative is that your meetings shift by an hour twice a year due to daylight savings.

This all gets more complicated the more countries you have involved.

While the United States ends daylight savings time on the first Sunday in November, many countries in Europe end daylight savings on the last Sunday in October, creating a daylight confusion time.

During daylight confusion time, meetings made by Americans may shift by an hour for Europeans and vice-versa.
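
As a concrete illustration, here is a minimal Python sketch (using the standard zoneinfo module; it is not taken from the original post) of how a meeting pinned to 09:00 New York time drifts in UTC across the US daylight saving change, which is exactly the shift that participants anchored to other timezones experience:

from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# A recurring meeting pinned to 09:00 in New York. Its UTC time shifts by an
# hour when US daylight saving time ends (7 November 2021), even though the
# local meeting time never changes.
for day in (1, 8):  # before and after the 2021 US change
    local = datetime(2021, 11, day, 9, 0, tzinfo=ZoneInfo("America/New_York"))
    print(local.isoformat(), "->", local.astimezone(ZoneInfo("UTC")).isoformat())

# 2021-11-01T09:00:00-04:00 -> 2021-11-01T13:00:00+00:00
# 2021-11-08T09:00:00-05:00 -> 2021-11-08T14:00:00+00:00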

I think the only thing to learn from this section is: you’ll mess it up.



from Hacker News https://ift.tt/3p9iLCA

The Best Way to Hug Someone, According to Science

Have you heard of the saying, “A hug a day keeps the doctor away”? Perhaps not.

Probably because it’s not a saying. But it’s certainly an assertion scientists have previously made based on the immunity-boosting health benefits of hugs. Hugs are society’s favorite form of expressing affection; this may be because they increase the oxytocin levels — or the “cuddle hormone” — in our bodies, leading people to associate the gesture with feelings of happiness.

But there is no conclusive formula of what is the “best” way to hug someone. How long should you hold people for? How much pressure should you apply? How should you cross your arms while holding people? So many questions. If you have ever wondered about these — or, like me, felt anxious about not knowing the right way to hug people — scientists recently decoded what makes a hug rather “pleasurable.”

Published in Acta Psychologica this month, a new study attempted to assess, and even quantify, the factors which influence how much we enjoy hugs.

Turns out, for most people, hugs that lasted less than one second were the least pleasurable; the ones lasting between five and 10 seconds, the most. “If 10 seconds sounds like an uncomfortably long time to hug a stranger, you’re not alone,” Science reported, noting the findings surprised even the authors of the study.

Additionally, “something that I would’ve liked to see in the study is the condition where you really extend the hug even more,” Julian Packheiser, a biopsychologist at Ruhr-Universität Bochum in Germany, who studies the effects of hugs on the body and brain, noted. He was not involved in the present research.


Then there is the matter of the style of hugging. The researchers noted neither the emotional closeness nor the height of people looking to lock each other in an embrace had much bearing on their style of hugging. However, if two people were of nearly the same height, the “neck-waist approach” was found to be slightly more common than when their heights differed drastically.

In general though, the “crisscross style” was found to be way more common than other approaches — accounting for almost 66 out of every 100 hugs. Scientists believe the “crisscross style” is something people perceive as “more egalitarian,” or “convey[ing] closeness without adding romantic subtext.” That explains why an overwhelming number of pairs of two men — 82% — preferred this style while hugging each other.

As an article in GQ stated, “admittedly primitive heterosexual norms… deem tenderness among males not ‘masculine.’” Then again, the present study isn’t really clear on how many of the men involved were heterosexual.

Unfortunately, the study didn’t focus on decoding the “right” amount of pressure to apply while hugging someone. But hug scientists do have their theories on the matter. “If it’s a romantic thing, [pressure] can be much more than if it’s a casual thing,” Packheiser explained.

The study sums up its finding on “pleasurable hugs” thus: “We advise using a five-second criss-cross hug to model a familiar and pleasant type of experience.”

For more questions about hugs that the present study didn’t answer, thankfully, this is hardly the first time scientists have tried to understand hugging norms. A 2018 study found that most people prefer right-sided hugs — even though left-sided hugs were found to be more common in positive as well as negative situations.

“This is because of the influence of the right hemisphere [of the brain], which controls the left side of the body and processes both positive and negative emotions… When people hug, emotional and motor networks in the [right hemisphere of the] brain interact and cause a stronger drift to the left in emotional contexts,” Packheiser, who was the lead author of the 2018 study, had said, explaining what makes people prefer the right side slightly more.


If you have more questions still (I do too!), you may find solace in this note from the present research: “We anticipate that the studies presented here will provide a foundation for future research on pleasant touch, especially for research on hugs, which are highly prevalent but still widely understudied.”

Isn’t it great timing for the study findings to be published right before the holiday season though — a ready reckoner of sorts? However, as wholesome as hugs might feel to many, it’s also important to remember that not everyone enjoys a “jaadoo ki jhappi.”

Some people suffer from haphephobia, or a fear of being touched, which can make hugs overwhelming for them. People may spiral into nausea, hyperventilation, or even, panic attacks; while the causes of haphephobia remain unknown, experts hypothesize it is a result of trauma. For several people on the autism spectrum too, hugs can be uncomfortable. As an autistic person, I have been secretly rejoicing the hug-less state of affairs, and hoping for a more hug-averse society.

Social anxiety, too, can make people hug-avoidant. “People who have higher levels of social anxiety, in general, may be hesitant to engage in affectionate touches with others, including friends,” Suzanne Degges-White, a professor of Counseling and Counselor Education at Northern Illinois University, in the U.S., explained.

In the meantime, it’s important to bear in mind that we’re still caught in the middle of a global health crisis — with experts even worrying that India may face the third wave in December. So, there’s wisdom in being cautious and following social distancing norms.

But decoding the mystery of hugs, and the many nuances at play here, may make touch more considerate and comforting.



from Hacker News https://ift.tt/3cPEolX

Questioning the Already Questionable State of Global Demand

If markets seem a bit on edge, I guess omicron seems a good reason if for no other reason than we don’t know much about it. But even that reaction points toward something else. A truly robust economy has little to fear from such unknowns, even from what might be predictable overreaction across the entire public sphere.

The knee-jerk negative sentiment provoked only just recently speaks to far more underlying uncertainty. We may not know what the newest COVID variant holds, but it may not actually matter. The economy is already questionable, and has been growing more so for quite some time; having topped out, the rebound maybe simply cannot afford any more questions, even of the lightest potential variety.

In other words, the issue at hand isn’t necessarily omicron nor really a “growth scare” grown more scary by another possible round of the pandemic.

This is about demand.

That’s why, I believe, the sharpest reaction came from something like oil which up to now has been surging based on the physical world undersupplying given a previous perception of steady demand. If “the market” starts thinking omicron’s another chapter to the corona story alone, then we’d also expect any more lockdowns to further menace supply.

Such a hypothetical situation would be oil price-positive, not so thoroughly (for the short run, anyway) sell-at-any-price.

No, the market is, right now, further questioning the demand side of the physical price formulation – and it is doing so on top of substantial questions about demand which have been lingering around for more than half a year so far. No matter if unreported in the mainstream financial media still cheering on a BOND ROUT!!!! that refuses to materialize, jittery traders have been moving in this direction (flat curves, after all) long before the unwelcome whispers.

Given that, the latter is merely a pile-on.

And what better way to try and assess global demand than from a Chinese perspective. Not just the world’s second biggest economy, this is the nexus between developed and emerging, the pivotal junction of economy and money (eurodollar) therefore the juxtaposition of all these factors at once.

Late last night, China’s National Bureau of Statistics (NBS) reported what were further reported to be “strong” results in its sentiment indicators, the various official PMI’s. I’m not sure in what world these numbers would be consistent with that interpretation, it’s just not the one we all inhabit.

Their manufacturing sector PMI barely managed above fifty, coming in at 50.1 for November 2021. That’s up from 49.2 in October, for one month breaking a string of declines dating back to March – which was a one-month break to a string of prior declines. And this is the “good” news.

NBS’s non-manufacturing or services PMI declined to 52.3 this month from 52.4 last month. Both of those are incredibly low by any historical comparison except the depths of last year as well as August this year.

This latter low is important to point out and appreciate because that was the work of the delta variant (the Chinese government imposing restrictions for it). And while that may have been the reason for the depths of both PMI’s and more direct economic data like retail sales, also like retail sales the rebound from August is hardly what any rational observer would claim robust.

On the contrary, thus far post-August the numbers have all been materially weaker than before August; continuing instead the post-March slowdown which had raised the uncertainty about global demand in the first place long before either delta or omicron.

These PMI’s along with China’s Comprehensive Output PMI (a composite made specifically from the Output indices of both the manufacturing and non-manufacturing versions weighted by each sector’s contribution to Chinese GDP) only demonstrate how delta had been a temporary amplification of the existing downward trend.

Like the services PMI, this composite in November, at just 52.2, looks more like 2018-19’s downturn/global pre-recession than even 2017, let alone late 2020.

All of them having clearly set their highs way back in March when the global bond market, and UST yield curve, began to raise doubts about realistic prospects for demand going forward.

Even on the upswing from summer’s delta variant in China, there’s so little up to the swing by which to be impressed or convinced there’s so much more to it coming over the coming months. All that even before omicron.

Looking across to other global data, including US consumer sentiment, it’s easy to see just how this isn’t just China, either.

The market’s knee-jerking over the past few trading days isn’t really about the pandemic. There remain serious problems and doubts as to the (largely unreported) underlying state of the entire global situation, demand first and foremost, which seems to be sliding (though, importantly, not falling off a cliff) no matter the state of the pandemic.


from Hacker News https://ift.tt/3lnAdCz

Haiku Now Has Experimental 3D Acceleration

I looked at the Zink code a bit and it seems possible to replace the softpipe driver of the existing Haiku Gallium add-on with the Zink driver. It uses memcpy() to draw the surface:

static void
zink_flush_frontbuffer(struct pipe_screen *pscreen,
                       struct pipe_context *pcontext,
                       struct pipe_resource *pres,
                       unsigned level, unsigned layer,
                       void *winsys_drawable_handle,
                       struct pipe_box *sub_box)
{
   struct zink_screen *screen = zink_screen(pscreen);
   struct sw_winsys *winsys = screen->winsys;
   struct zink_resource *res = zink_resource(pres);

   if (!winsys)
      return;

   /* Map the software winsys display target that will be shown on screen. */
   void *map = winsys->displaytarget_map(winsys, res->dt, 0);

   if (map) {
      /* Query the row pitch of the rendered Vulkan image... */
      VkImageSubresource isr = {};
      isr.aspectMask = res->aspect;
      isr.mipLevel = level;
      isr.arrayLayer = layer;
      VkSubresourceLayout layout;
      vkGetImageSubresourceLayout(screen->dev, res->image, &isr, &layout);

      /* ...then map its backing device memory into host address space. */
      void *ptr;
      VkResult result = vkMapMemory(screen->dev, res->mem, res->offset, res->size, 0, &ptr);
      if (result != VK_SUCCESS) {
         debug_printf("failed to map memory for display\n");
         return;
      }

      /* Copy the image into the display target one row at a time with memcpy(). */
      for (int i = 0; i < pres->height0; ++i) {
         uint8_t *src = (uint8_t *)ptr + i * layout.rowPitch;
         uint8_t *dst = (uint8_t *)map + i * res->dt_stride;
         memcpy(dst, src, res->dt_stride);
      }
      vkUnmapMemory(screen->dev, res->mem);
   }

   winsys->displaytarget_unmap(winsys, res->dt);

   /* Hand the filled display target to the winsys for presentation. */
   assert(res->dt);
   if (res->dt)
      winsys->displaytarget_display(winsys, res->dt, winsys_drawable_handle, sub_box);
}



from Hacker News https://ift.tt/3d2rEbx

Why Decentralization Matters

The first two eras of the internet

During the first era of the internet — from the 1980s through the early 2000s — internet services were built on open protocols that were controlled by the internet community. This meant that people or organizations could grow their internet presence knowing the rules of the game wouldn’t change later on. Huge web properties were started during this era including Yahoo, Google, Amazon, Facebook, LinkedIn, and YouTube. In the process, the importance of centralized platforms like AOL greatly diminished.

During the second era of the internet, from the mid 2000s to the present, for-profit tech companies — most notably Google, Apple, Facebook, and Amazon (GAFA) — built software and services that rapidly outpaced the capabilities of open protocols. The explosive growth of smartphones accelerated this trend as mobile apps became the majority of internet use. Eventually users migrated from open services to these more sophisticated, centralized services. Even when users still accessed open protocols like the web, they would typically do so mediated by GAFA software and services.

The good news is that billions of people got access to amazing technologies, many of which were free to use. The bad news is that it became much harder for startups, creators, and other groups to grow their internet presence without worrying about centralized platforms changing the rules on them, taking away their audiences and profits. This in turn stifled innovation, making the internet less interesting and dynamic. Centralization has also created broader societal tensions, which we see in the debates over subjects like fake news, state sponsored bots, “no platforming” of users, EU privacy laws, and algorithmic biases. These debates will only intensify in the coming years.

“Web 3”: the third era of the internet

One response to this centralization is to impose government regulation on large internet companies. This response assumes that the internet is similar to past communication networks like the phone, radio, and TV networks. But the hardware-based networks of the past are fundamentally different than the internet, a software-based network. Once hardware-based networks are built, they are nearly impossible to rearchitect. Software-based networks can be rearchitected through entrepreneurial innovation and market forces.

The internet is the ultimate software-based network, consisting of a relatively simple core layer connecting billions of fully programmable computers at the edge. Software is simply the encoding of human thought, and as such has an almost unbounded design space. Computers connected to the internet are, by and large, free to run whatever software their owners choose. Whatever can be dreamt up, with the right set of incentives, can quickly propagate across the internet. Internet architecture is where technical creativity and incentive design intersect.

The internet is still early in its evolution: the core internet services will likely be almost entirely rearchitected in the coming decades. This will be enabled by crypto-economic networks, a generalization of the ideas first introduced in Bitcoin and further developed in Ethereum. Cryptonetworks combine the best features of the first two internet eras: community-governed, decentralized networks with capabilities that will eventually exceed those of the most advanced centralized services.

Why decentralization?

Decentralization is a commonly misunderstood concept. For example, it is sometimes said that the reason cryptonetwork advocates favor decentralization is to resist government censorship, or because of libertarian political views. These are not the main reasons decentralization is important.

Let’s look at the problems with centralized platforms. Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows.

When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits. Historical examples of this are Microsoft vs. Netscape, Google vs. Yelp, Facebook vs. Zynga, and Twitter vs. its 3rd-party clients. Operating systems like iOS and Android have behaved better, although still take a healthy 30% tax, reject apps for seemingly arbitrary reasons, and subsume the functionality of 3rd-party apps at will.

For 3rd parties, this transition from cooperation to competition feels like a bait-and-switch. Over time, the best entrepreneurs, developers, and investors have become wary of building on top of centralized platforms. We now have decades of evidence that doing so will end in disappointment. In addition, users give up privacy, control of their data, and become vulnerable to security breaches. These problems with centralized platforms will likely become even more pronounced in the future.

Enter cryptonetworks

Cryptonetworks are networks built on top of the internet that 1) use consensus mechanisms such as blockchains to maintain and update state, 2) use cryptocurrencies (coins/tokens) to incentivize consensus participants (miners/validators) and other network participants. Some cryptonetworks, such as Ethereum, are general programming platforms that can be used for almost any purpose. Other cryptonetworks are special purpose, for example Bitcoin is intended primarily for storing value, Golem for performing computations, and Filecoin for decentralized file storage.

Early internet protocols were technical specifications created by working groups or non-profit organizations that relied on the alignment of interests in the internet community to gain adoption. This method worked well during the very early stages of the internet, but since the early 1990s very few new protocols have gained widespread adoption. Cryptonetworks fix these problems by providing economic incentives to developers, maintainers, and other network participants in the form of tokens. They are also much more technically robust. For example, they are able to keep state and do arbitrary transformations on that state, something past protocols could never do.

Cryptonetworks use multiple mechanisms to ensure that they stay neutral as they grow, preventing the bait-and-switch of centralized platforms. First, the contract between cryptonetworks and their participants is enforced in open source code. Second, they are kept in check through mechanisms for “voice” and “exit.” Participants are given voice through community governance, both “on chain” (via the protocol) and “off chain” (via the social structures around the protocol). Participants can exit either by leaving the network and selling their coins, or in the extreme case by forking the protocol.

In short, cryptonetworks align network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy skeptics and flourish, even while new cryptonetworks like Ethereum have grown alongside it.

Today’s cryptonetworks suffer from limitations that keep them from seriously challenging centralized incumbents. The most severe limitations are around performance and scalability. The next few years will be about fixing these limitations and building networks that form the infrastructure layer of the crypto stack. After that, most of the energy will turn to building applications on top of that infrastructure.

How decentralization wins

It’s one thing to say decentralized networks should win, and another thing to say they will win. Let’s look at specific reasons to be optimistic about this.

Software and web services are built by developers. There are millions of highly skilled developers in the world. Only a small fraction work at large technology companies, and only a small fraction of those work on new product development. Many of the most important software projects in history were created by startups or by communities of independent developers.

“No matter who you are, most of the smartest people work for someone else.” — Bill Joy

Decentralized networks can win the third era of the internet for the same reason they won the first era: by winning the hearts and minds of entrepreneurs and developers.

An illustrative analogy is the rivalry in the 2000s between Wikipedia and its centralized competitors like Encarta. If you compared the two products in the early 2000s, Encarta was a far better product, with better topic coverage and higher accuracy. But Wikipedia improved at a much faster rate, because it had an active community of volunteer contributors who were attracted to its decentralized, community-governed ethos. By 2005, Wikipedia was the most popular reference site on the internet. Encarta was shut down in 2009.

The lesson is that when you compare centralized and decentralized systems you need to consider them dynamically, as processes, instead of statically, as rigid products. Centralized systems often start out fully baked, but only get better at the rate at which employees at the sponsoring company improve them. Decentralized systems start out half-baked but, under the right conditions, grow exponentially as they attract new contributors.

In the case of cryptonetworks, there are multiple, compounding feedback loops involving developers of the core protocol, developers of complementary cryptonetworks, developers of 3rd party applications, and service providers who operate the network. These feedback loops are further amplified by the incentives of the associated token, which — as we’ve seen with Bitcoin and Ethereum — can supercharge the rate at which crypto communities develop (and sometimes lead to negative outcomes, as with the excessive electricity consumed by Bitcoin mining).

The question of whether decentralized or centralized systems will win the next era of the internet reduces to who will build the most compelling products, which in turn reduces to who will get more high quality developers and entrepreneurs on their side. GAFA has many advantages, including cash reserves, large user bases, and operational infrastructure. Cryptonetworks have a significantly more attractive value proposition to developers and entrepreneurs. If they can win their hearts and minds, they can mobilize far more resources than GAFA, and rapidly outpace their product development.

“If you asked people in 1989 what they needed to make their life better, it was unlikely that they would have said a decentralized network of information nodes that are linked using hypertext.” — Farmer & Farmer

Centralized platforms often come bundled at launch with compelling apps: Facebook had its core socializing features and the iPhone had a number of key apps. Decentralized platforms, by contrast, often launch half-baked and without clear use cases. As a result, they need to go through two phases of product-market fit: 1) product-market fit between the platform and the developers/entrepreneurs who will finish the platform and build out the ecosystem, and 2) product-market fit between the platform/ecosystem and end users. This two-stage process is what causes many people — including sophisticated technologists — to consistently underestimate the potential of decentralized platforms.

The next era of the internet

Decentralized networks aren’t a silver bullet that will fix all the problems on the internet. But they offer a much better approach than centralized systems.

Compare the problem of Twitter spam to the problem of email spam. Since Twitter closed their network to 3rd-party developers, the only company working on Twitter spam has been Twitter itself. By contrast, there were hundreds of companies that tried to fight email spam, financed by billions of dollars in venture capital and corporate funding. Email spam isn’t solved, but it’s a lot better now, because 3rd parties knew that the email protocol was decentralized, so they could build businesses on top of it without worrying about the rules of the game changing later on.

Or consider the problem of network governance. Today, unaccountable groups of employees at large platforms decide how information gets ranked and filtered, which users get promoted and which get banned, and other important governance decisions. In cryptonetworks, these decisions are made by the community, using open and transparent mechanisms. As we know from the offline world, democratic systems aren’t perfect, but they are a lot better than the alternatives.

Centralized platforms have been dominant for so long that many people have forgotten there is a better way to build internet services. Cryptonetworks are a powerful way to develop community-owned networks and provide a level playing field for 3rd-party developers, creators, and businesses. We saw the value of decentralized systems in the first era of the internet. Hopefully we’ll get to see it again in the next.

Originally published on Medium.



from Hacker News https://ift.tt/3czspYx

Sorry, but you're already living in the “Squid Game”

You’ve probably heard about the hugely popular South Korean hit show Squid Game. It has resonated with audiences around the globe because, beyond its aesthetic and narrative appeal, it brilliantly captures the excesses of work today while revealing our worst instincts. Our expert Laetitia Vitaud talks about this global success story (warning: contains spoilers).

It wouldn’t be an exaggeration to call Squid Game the biggest TV sensation of 2021.

The series tells the story of a group of cash-strapped contestants caught in a game where the losers die, with one survivor taking a huge prize. The challenges are based on popular Korean children’s games.

At first glance, this dystopian vision is nothing new. From The Game to Battle Royale and The Hunger Games, cinema has often drawn on gladiator-style death matches as a metaphor for our powerlessness. Tossed around by fate, battered by a capitalist society in which we are mere pawns, we “play” to survive and, sometimes, to amuse the powerful of this world.

But Squid Game is more than just another take on a familiar theme. With approximately 111 million views in just one month, it is now Netflix’s biggest-ever series launch. The show is another indicator of the Korean wave and the growing influence of Korean culture in the West. It follows in the footsteps of the 2019 Oscar-winning film Parasite, a dark vision of a world where the poor survive off crumbs left by the rich. In much the same way, Squid Game exposes the injustices, absurdities and conflicts inherent in the Korean labour market.

Having become a major industrial power in the 1980s, Korea is now stuttering somewhat and still relies on the immense gains its industrial conglomerates (chaebol) have made over the past century. But cultural conflicts between old and new have been transforming South Korean society. Traditional patriarchal values are under attack, while the social critique inherent in works such as Squid Game and Parasite has really hit home with audiences. Today, the chaebols are coming under fire for widespread corruption and cronyism. There are also the twin crises of excessive household debt and the continuing lack of social-security provision/insurance. And in an extraordinarily sexist society, women are choosing not to have children so they can hold onto their careers. This has led to a fertility rate that, along with Taiwan’s, is the lowest in the world, at approximately one child per woman.

But Squid Game has struck a chord with global audiences because its social critique is relevant outside the “Land of the Morning Calm.” It reflects growing anxieties over housing costs, excessive debt, increased competition between individuals, a shortage of good and meaningful work, digital surveillance and many other things that don’t just affect South Koreans. Squid Game’s message is universal, revealing everything that’s wrong with work as the world shifts from the industrial to the digital paradigm.

Here are six ways in which Squid Game mirrors everyday life.

Winner takes all: the economic principle that turns people into wolves

In the series:

In Squid Game, the last person standing wins a huge sum of money. Each time a contestant dies, their “share” is put into the perverse jackpot. This mirrors the economic principle of “winner takes all.”

In everyday life:

In an increasingly digital economy, many see this principle as having become ubiquitous. Due to the network effect, natural monopolies emerge, with just a few winners – or, what’s even more likely, a single winner – in each market. That’s why so many startups use the hefty funds they’ve raised from investors to secure the top spot in their market. The whole point is to ensure only one company is left at the end. A startup must come out on top and eliminate the competition in order to survive. As Azeem Azhar explains in his book Exponential: How Accelerating Technology Is Leaving Us Behind and What to Do About It, “While we’ve always had firms that do better than others, the difference between the best and the worst is greater than ever.”

The business world also follows this logic to an increasing degree, where the “best” take home virtually all the money. And for half a century, the gap between the richest 10% and the poorest 10% has been widening steadily. While millions of employed people are struggling to find housing and make ends meet, big-tech billionaires amass fortunes that are increasingly hard to fathom.

Silicon Valley’s winner-takes-all approach has recently come under fire. First, not all markets follow this logic. And second, the polarisation of the labour force is not simply down to economics. It’s also a consequence of political choices such as lessening the redistribution of wealth. What if success didn’t mean the elimination of all competition? This question is at the core of current critiques regarding the world of work.

The toxic myth of meritocracy

In the series:

In Squid Game, the organisers insist that every player has an equal chance of winning. Cheating is off-limits. Some of the characters even remark that the rules are fair and straightforward. In their eyes, “real life” is worse because following the rules doesn’t get you ahead. Meritocracy is a fallacious idea – a convenient lie told to those whose lack of wealth or social class leaves them at a distinct disadvantage. In the series, the sole immigrant character and the few female players are eliminated after experiencing discrimination from their fellow players. In other words, it will still be a long time before a Korean series shows a Pakistani immigrant, or a woman, winning.

In everyday life:

While the critique of an oppressive meritocracy is nothing new, it has been gaining ground in both the US and Europe. An example was the Black Lives Matter movement inspiring people around the world in 2020. But as sociologists and behavioural economists have determined, multiple biases govern the game we play. There’s no such thing as a level playing field, with birth continuing to have a great influence on determining our career path. Even American society, which until recently believed in the “American Dream,” is no longer fooled. Meritocracy is false. In his book The Tyranny of Merit, the philosopher Michael Sandel argues that we must rethink the attitudes to success and failure that have grown alongside globalisation and rising inequality.

For Sandel, the hubris that a meritocratic culture generates in the winners is deeply toxic. It imposes harsh judgments on those left behind, who are then deprived of any means of escape from their condition. Sandel believes that success should be redefined according to an ethos of humility and solidarity, with greater respect for the dignity of work. For others, exploding the myths of meritocracy means greater redistribution of wealth and offering everyone the same chances for a rewarding professional life.

Our social protection is lacking – time to reinvent it!

In the series:

The daily life of the characters in Seoul reveals a striking lack of social protection. Those who lose their jobs risk falling into an unending spiral of debt. In terms of consumer debt, Korea is a world leader. Since 2010, its GDP has been growing at a slower pace than the debt of South Korean households. In the series, those in debt are vulnerable to all sorts of predators, including underground organ traffickers and recruiters for deadly games aimed at entertaining a hidden gallery of VIPs. All they needed was decent unemployment insurance!

In everyday life:

Social protection is a set of collective welfare mechanisms that help individuals and households cope with risks such as unemployment, illness, disability or old age. Welfare systems vary around the world, with South Korea lagging behind most European countries. But as the world shifts from the industrial to the digital paradigm, more and more people are falling through the holes in the safety net: self-employed people who lose their work, part-time workers who live alone, and all those without access to decent housing.

In the United States, the shortcomings of the social-security system have been hotly debated in recent years. Led by a new generation of politicians, such as Alexandria Ocasio-Cortez, people are calling for parental leave, improved childcare, and greater protection and a more level playing field for the employed. While the South Korean safety net is the bare minimum, social protection needs improvement worldwide.

Alienated by constant surveillance

In the series:

Each player in Squid Game is given a number and tracked, in a manner reminiscent of a forced-labour camp. The game master has a gigantic display to monitor who is still left in the competition. Between games, players spend their hours in a dormitory where they remain under constant supervision. The dorm even serves as a testing ground for the game designers and a source of entertainment for the VIPs – elderly white men – who have come for the show. The boundaries between rest and play blur, each seeping into the other, and the contestants are unwittingly observed even during their downtime.

In everyday life:

The pervasiveness of digital workplace tools has made some tasks easier and often improved working conditions. But it has also opened the door to monitoring. With mobile devices, managers can track employee movements and continuously gauge performance. Collaborative tools, meanwhile, clock people in and out remotely. Electronic messaging, particularly email, reinforced by the rise of telecommuting, has completely blurred the lines between the workplace and the home. Many feel their employers are watching them, even in their private lives. The digital workplace has become a giant panopticon, bringing with it the prospect of continual surveillance and growing alienation.

Criticism of surveillance by big tech is growing. Facebook’s reputation has been damaged by the impression that, to the company, people are just numbers that help sell advertising space; for many digital businesses, users are simply a means to an end. In her groundbreaking book The Age of Surveillance Capitalism, Shoshana Zuboff explains how digital companies have made a business out of our personal data. For her, surveillance capitalism threatens free will and democracy alike.

The extreme infantilisation of employees

In the series:

The challenges in Squid Game are all taken from children’s games typical in Korean culture. One challenge involves Red Light, Green Light with a giant, creepy doll, while another takes place in a playground with slides. There’s also a series of alleyways and side streets configured for marbles. The infantilization of adults and nostalgia for childhood are two central themes in Squid Game.

In everyday life:

In recent years, there has been growing criticism of digital platforms’ use of video-game principles to influence users to stay connected and keep working. Algorithmic management treats users like children to be manipulated, denying them their free will. In more traditional organisations, it is the division of labour and subordination that treat employees like children stripped of their independence: their performance and presence in the office are monitored because, like children, they might do something foolish if left to their own devices. As populations age, the cult of youth pushes adults who want to work into trying to look younger than they are, as though being an adult held no value. This is even more true in South Korea, where 1.2 million cosmetic-surgery procedures are performed every year.

But being a responsible adult means having more dignity at work. This idea leads many people to become self-employed. Values associated with artisanal work, such as autonomy, responsibility and creativity, are popular because more people reject the infantilisation inherent in industrial work. Treating people like children was one way to subdue them. Today’s professionals want to be treated as responsible adults.

Working in a man’s world: a losing game for women

In the series:

Of all the contestants, only 10% are female. Like the South Korean workplace, the world depicted in Squid Game is not kind to women. Male players don’t want them on the team because the women are seen as weak, incapable and hysterical. The female protagonists embody a tough choice facing South Korean women today who want a career. Either they play the game of submitting to men for protection, or they get rid of men altogether and form alliances with other women. In this scenario, neither the humbled straight woman nor the rebellious queer woman is the winner.

In everyday life:

At work, many women find themselves forced to play games that make others forget they are women, often typecast in stereotypical roles such as the mother or the whore. The unequal division of domestic chores remains a huge obstacle to professional equality: when you have to do all the unpaid work, there is less room for a career. In Europe, the United States and Korea, the “motherhood penalty” continues to take its toll, and many women of retirement age struggle financially, especially if they have raised children alone. Faced with this prospect, more and more women are refusing to pay the price and choosing not to have children at all. Meanwhile, in companies where there is room for only one woman at the top, female rivalry takes hold. Yet even in Korea, more and more women would like to break free from the extreme sexism of the corporate world and rely on sisterhood, like the character of Kang Sae-byeok, Squid Game’s queer North Korean character.

In short, Squid Game is a cry of despair and the expression of overwhelming frustration. The series echoes the “great resignation” of employed people who are no longer willing to accept the working conditions imposed on them. In the United States and Europe, the ongoing labour shortage has put them in a winning position: wages are rising, and the widening of inequality seems to be slowing. Debates about social protection abound, and feminist demands are sometimes being heard. So, what does Squid Game’s success tell us? As far as contemporary attitudes to work are concerned, the tide may well be turning.

Photo by Thomas Decamps

Article edited by Ariane Picoche

Translated by Andrea Schwam




from Hacker News https://ift.tt/3rk6OwP

Atari took on Apple with the Atari 400 and Atari 800 PCs

Forty years ago, Atari released its first personal computers: the Atari 400 and 800. They arrived in the fall of 1979 after a prerelease marketing campaign that had begun the previous January when the company unveiled the machines at what was then called the Winter Consumer Electronics Show in Las Vegas.

Then as now, “Atari” was synonymous with “video game,” and the new machines packed more technological potential than any game console at the time, with custom graphics and sound chips, support for four joysticks or eight paddles, and the ability to play games on cartridge, cassette, or disk. At launch, one of the machines’ first games, Star Raiders, defined cutting-edge home entertainment.

And yet Atari initially marketed the 800 and its lower-cost counterpart, the Atari 400, as “second-generation” PCs—productivity machines with enhanced graphics and sound capabilities over the 1977 holy trinity of personal computing: the Apple II, Commodore PET, and TRS-80. The company intended them to crunch home budget numbers just as often as they simulated space battles.

Idiot-proof and rugged, Atari’s Home Computer System machines (I’ll call the platform “HCS” for short) represented a huge leap in consumer-friendly personal computing. Unlike many PCs of the time, the Atari machines exposed no bare electronics to the consumer. Unique keyed connectors meant that all of the machines’ ports, modules, and cartridges couldn’t be plugged into the wrong places or in the incorrect orientation. The 400 even featured a flat spillproof keyboard aimed at fending off snack-eating children.

And because restrictive FCC rules precluded the kind of open expansion slots found on the Apple II, Atari designed a suite of intelligent plug-and-play peripherals linked together by a serial IO bus that presaged the ease of the much-later USB.

In some ways the Atari computers even exceeded the state of the art from Atari’s coin-op department: In 1979, most Atari arcade games shipped with black-and-white monitors, using translucent gel overlays to generate pseudo-color. The Atari computers played games in color from the start—if the consumer provided the color TV set, of course.

At launch, the Atari 800 retailed for $999 with 16K of RAM (about $3,387 when adjusted for inflation), and the Atari 400 with 8K retailed for $549 (about $1,861 today). Compared to a game console such as the Atari VCS at $190, that was expensive, but it undercut the 16K Apple II’s $1,195 retail price in 1979.

This fancy retail kiosk let consumers learn about Atari’s computers—and even partake in a game of Pac-Man. [Photo: courtesy of Atari]
My own association with Atari’s computers goes back to 1981, when my father bought an Atari 800 for my older brother Jeremy, five years my senior. I grew up watching him wear out its joysticks by the half-dozen while mastering his skills in Asteroids, Dig Dug, and Archon. And the Atari served as more than a game machine for him. With its BASIC programming cartridge, the Atari opened up software as a malleable thing that could be shaped at will. It was on the Atari 800 that my brother amazed me with his homemade BASIC simulations of aircraft dogfights, among other wonders that my 4-year-old mind could hardly fathom but loved nonetheless. He later became a software engineer.
The author’s brother and a neighbor enjoy some Atari 800 quality time circa 1983. [Photo: courtesy of Benj Edwards]
Decades later, I still play Atari 800 games with my kids. It’s my home entertainment version of comfort food—a rich pastime best enjoyed by a roaring fireplace in a wood-paneled den. The Atari home computers projected a distinctive voice as an entertainment platform that I can’t get anywhere else. Games such as M.U.L.E., The Seven Cities of Gold, and Star Raiders take me back to a golden era in PC gaming and remind me that technology can create timeless classics as well as any other medium.
The author’s brother programming the Atari 800 in BASIC. [Photo: courtesy of Benj Edwards]
I’ve often wondered what cultural and business elements came together to make this breakthrough platform—this favorite electronic friend from my childhood. With some digging, I recently found out.

Video game genesis

In 1977, Atari released its first video game console with interchangeable cartridges, the Video Computer System, or VCS. (The company would later rechristen the machine “Atari 2600” from its model number, CX-2600.) A group of Atari engineers led by Jay Miner anticipated a three-year market lifespan for the 2600, which contained only 128 bytes of RAM. (As a frame of reference, the Nintendo Switch has more than 30 million times as much.)

That same year, Atari’s home computer platform began to take shape as a high-powered follow-up to the 2600. Many questions swirled around the next-generation machine. Should it remain compatible with the VCS but offer more features? Or should Atari make a clean break with the past and launch a far more advanced design?

“I was in the Homebrew Computer Club when Steve Wozniak introduced the Apple I in the winter of 1976,” says Joe Decuir, one of the Atari 800’s chipset architects and a veteran of the 2600 team. Decuir had begun working at Atari in 1975, hired to help with the VCS design. “One of the reasons I took the job is I thought that the project after a game machine was going to be a computer,” he explains.

Atari engineers weren’t blind to events around them in Silicon Valley. Ideas cross-pollinated between companies through social connections, local interest groups, and employee poaching between firms. One of the most important technological and societal movements of the 20th century had been taking shape: the birth of the personal computer. PCs became the new cool thing, and Atari’s engineers wanted a piece of the action.

Decuir says, “A lot of us were kicking around ideas about what a computer would do while we were doing [the 2600]. And as the core of the game machine grew, Jay Miner and I and company became the core of the computer design group, which was a much larger project.”

This group included talented Atari engineers such as Steve Mayer, Francois Michel, George McLeod, Doug Neubauer, Mark Shieu, and others. (Later, Doug Hardy and Kevin McKinsey handled industrial design.) After some brainstorming, the engineering group began with a simple goal: to take the 2600’s video chip, called TIA, and integrate computer-like capabilities such as text generation.

After many iterations, the new chip became the CTIA, the graphics integrated circuit at the heart of the new home computer. Then they designed a chip to take the load off the main CPU by feeding graphics data to the CTIA, and that became ANTIC, itself a custom microprocessor. The engineers also added a chip to handle keyboard, paddle input, and four-channel sound, called POKEY. These three custom chips, in league with a 6502 CPU, would form the core of Atari’s home computer architecture.

As a plan developed between 1977 and 1978 from many design meetings, Atari’s engineering team narrowed down the computer to three product options. There would be a low-end machine, nicknamed Candy, that would serve as a game console with an optional keyboard attachment; a high-end machine code-named Colleen with advanced, integrated features and an expansion bus; and a machine with an integrated monitor. They ended up dropping the integrated monitor idea and focusing on Candy and Colleen. Those would become the 400 and 800 computers.

To compete with the Apple II, the higher-end Atari 800 would need peripherals. And that’s where the FCC got in the way. All electronic circuits emanate radio waves when current flows—it’s one of the fundamental properties of electricity. To make sure that TV-related electronics devices don’t degrade TV reception, the FCC tightly regulates the radio frequency (RF) emissions that they can release.

At the time of the Atari HCS’s development, the FCC kept very strict rules on RF interference. Atari wanted an RF output that would allow the 800 to use a regular TV set as a display, but that meant clamping down on the potential expandability of the system. Atari engineers designed thick metal shielding within the 400 and 800 that blocked electromagnetic emanations from its core electronics.

That kept Atari from creating an “open box” type system, similar to the Apple II, where users could plug in any expansion card they wanted. The Apple II avoided FCC issues by not connecting to a TV set directly; instead, Apple allowed a third-party company to provide that as an aftermarket option. As a hobbyist machine, the Apple II could get away with that. The TI-99/4, released in 1979, skirted the RF interference issue by shipping with its own special monitor—a stripped-down TV set.

Texas Instruments lobbied to have the RF interference rules relaxed, and the FCC granted a conditional waiver of the rules in late 1979 (it finally changed them in 1983), but by then it was too late for Atari to simplify the design of its machines before launch.

M.U.L.E., an early Electronic Arts game, combined action, strategy, and economic theory on a planet named Irata (get it?). [Screenshot: courtesy of MobyGames]

Killer app in space

After finishing up design work on the POKEY chip, engineer Doug Neubauer began writing a game for the new computer system in development. It would be a first-person interpretation of his favorite computer space strategy game, Star Trek, which was making the rounds on high-powered mainframes at the time. His game, Star Raiders, included a real-time 3D universe full of alien ships, starports, and meteoroids. Its unique design began to attract attention within the firm.

“The first day I came [to Atari], one of the programmers sat down with me and said, ‘Get a load of this,’ and showed me Star Raiders,” recalls Chris Crawford, who had recently joined Atari as a VCS software developer and was to become a high-profile game creator and software evangelist for the HCS platform. “And that was what blew me away. There was absolutely nothing like it in the world of personal computers. It was just way beyond what anybody would have expected.”

In its time, Star Raiders was as dazzling as video gaming got. [Screenshot: courtesy of Moby Games]
Neubauer sought to realistically model 3D space, and he integrated advanced graphics routines that had never been seen in a PC or home console game. When an enemy ship exploded, the game engine calculated its flotsam as 3D points that could be viewed from any angle, including an aft ship viewpoint as you flew through it. Rich and dramatic sound effects complemented the game’s visual flair to an extent that wasn’t possible on competing home PCs or game consoles at the time.

While playing Star Raiders, you use a joystick to pilot a starship from a first-person cockpit view. A starfield swirls around your viewpoint realistically as you move the stick, but the full breadth of the controls proved too complex for just the one-button Atari joystick to handle. Players can call up a detailed galactic map, change speed, turn on shields, or choose other options by pressing certain keys on the computer keyboard. Upon engaging hyperspace with the H key, your ship’s engines rev up, and stars streak across the screen like the Millennium Falcon in Star Wars.

Star Raiders just blew [Atari cofounder and then CEO] Nolan [Bushnell] and upper management away,” recalls Decuir. (Bushnell was involved with initial plans for the new computers, though he was forced out of Atari by its owner, Warner Communications, around a year before they hit the market.) “They said, ‘Well, we can’t sell the game machine without a keyboard.’ So they came up with this membrane keyboard for the 400. That was our original game machine, but it came out as a minimally functional but useful computer. You needed a keyboard to play Star Raiders.”

Shortly thereafter, the low-end Candy model of the Home Computer System became the Atari 400 that Atari was to release—a sleek, dark beige machine with a completely flat keyboard built in. Though good looking, the keyboard wasn’t fun to type on—but it did let everyone experience Star Raiders. The high-end machine, the 800, would have a conventional, full-travel keyboard more suited to tasks such as word processing.

Even the lower-end Atari 400 offered dazzling multimedia capabilities compared to the Apple II and other first-generation home PCs. [Photo: courtesy of Atari]
Atari kept the 400 and 800 segregated within a new home computing division. Its VCS game console had just begun to soar on the market, and some within the firm feared that the 400/800 would cannibalize VCS sales if marketing emphasized the computers’ gaming attributes too keenly. The revolutionary nature of Star Raiders completely disrupted that plan.

In an era energized by 1977’s blockbuster film Star Wars, Atari’s new space game provided an engrossing mixture of action, strategy, and simulation unlike any that came before. Shortly after their launch, people began buying Atari 400 and 800 machines solely to play Star Raiders. It became the killer app for the Atari computers and remained the game to beat for at least two years into the HCS’s lifespan.

In 1981, Mike Moffitt, a Pennsylvania newspaper journalist, described Star Raiders as “the Cadillac of home video games” and “the most sophisticated of all home video games.” He also noted the high price of the systems required to play the game but concluded it was worth it.

Just as the Atari 400/800 soared thanks to Star Raiders, the competing Apple II—then the main target of Atari marketing—became a breathtaking success due to business applications such as VisiCalc. For a time, Atari charged ahead with the serious business angle for its home PCs, reluctant to fully and publicly accept the platform’s deep video game capabilities. It created an unusual dissonance noticed by the press and consumers alike.

“Atari all along struggled with its identity,” said former Atari employee Dale Yocum in a 2014 interview with the ANTIC Atari podcast. “Atari was a game company, and people identified it as a game company. But Atari really wanted to be a personal computer company. And it was hard to convince a Fortune 1000 company that ‘Hey, what you really want to do is buy a bunch of Atari computers and put them on everybody’s desk.'”

Despite the huge gaming draw, many dedicated owners did use their Atari 800s as serious computers for productivity tasks and telecommunications. But with a 40-column text display, slow serial-based peripherals, and limited expandability, the Atari 800 wasn’t the most efficient machine for the job. (By the mid-to-late 1980s, my dad kept an Apple IIc right next to our Atari 800. The Atari reigned for gaming, while the IIc pulled duty as 80-column word processor and spreadsheet machine.)

According to Crawford, Atari wasn’t too upset about the tepid reception of its new product line as a “serious” machine; the company was rolling in dough from video game sales. A cost-saving quirk of the 2600 video chip design allowed Atari’s creative programmers to extend the console’s lifespan far beyond what its engineers expected, resulting in more sophisticated games and greater sales by 1979.

“That Christmas, the VCS was so successful that they gave a huge bonus to all the programmers,” says Crawford. “And so, the feeling was, ‘Wow, we are on the right track.'”

The golden age of indie software

After the 400 and 800 launched, power users awed by Star Raiders proved eager to flex the machine’s advanced capabilities. But Atari, following its closed model with the 2600, had never intended to spill the secrets of the HCS architecture outside of special agreements with contracted developers. Crawford recalls, “There were about half a dozen people I knew who’d been bugging me for that information, and I had told them, ‘No, I can’t tell you anything.'”

The Atari 400’s flat keyboard frustrated typists but didn’t prevent it from being a superb game console. [Photo: Flickr user Michael Dunn]
Software for the HCS came slowly. At first, Atari retained only one programming department for both the VCS and HCS lines. As Atari’s main breadwinner, VCS software took precedence. “The rule was you cannot do anything on the HCS until you’ve had one game completed for the VCS,” says Crawford. But the easier-to-program HCS became an attractive target. “The general sentiment in the programming department was, ‘I want to move to the HCS as soon as possible.'”

The executive preference for 2600 software put an internal chokehold on Atari 400/800 program development, especially in terms of games, which Atari management frowned upon. Crawford recalls making a presentation to Atari marketing about a new educational simulation about energy policy (later called Energy Czar). “I went through the presentation, and at the end, the VP of marketing fixed me with a cold stare and asked, ‘Is this a game?,'” he remembers. “I hastily replied, ‘No, no, it’s an educational simulation.’ He looked at me warily and said, ‘I don’t know; it sure looks like fun to me.'”

Atari’s Chris Crawford in a humorous personal shot taken by his wife recalling his WWII-themed Eastern Front game. [Photo: courtesy of Atari]
Up until that point, all software for the Atari’s 2600 game console had been published by Atari. But times were changing in the video game industry. In 1979, a group of disgruntled star software developers left Atari and founded Activision, which would later release blockbuster titles for the VCS. Some of the programmers, such as David Crane and Al Miller, had been responsible not only for most of Atari’s hit 2600 titles but also for writing the operating system and several games and applications for the new 400/800 platform. Although many talented programmers remained at Atari, the loss of top game design talent proved a setback for Atari’s internal software development capacity.

In 1980, things began to shift. After weighing the demand from independent developers, the Activision exodus, and the success of Apple’s large and vibrant third-party software market, Atari’s executives reversed the company’s closed-platform home computer policy. Crawford received the news with joy and contacted developers. “I got on the phone, called them all up, and said, ‘Well, guess what? Where do I mail the documentation to?’”

As a productivity machine, Atari had lost valuable time in the market with a slim suite of primitive applications (mostly developed internally), although an Atari version of the original spreadsheet, VisiCalc, did land in late 1980. By early 1981, the size of the HCS software library paled in comparison to those of machines from Apple and Radio Shack. A 1981 review of the Atari 800 in InfoWorld, about one and a half years into the HCS launch, noted the Atari’s distinct lack of software and called it “an impressive machine that has not yet reached its full computing potential.”

Atari needed software, and fast. To champion developers, Chris Crawford created the Software Development Support Group within Atari. As a first project, it created a user-friendly development bible called De Re Atari (meaning “All About Atari”), which became the de facto guide for Atari computer programming. Crawford also began flying around the country to give in-person two-day seminars about how the Atari 800 worked and how to program it.

On another innovative front, an Atari employee named Dale Yocum petitioned Atari management to start a new division within the firm that would solicit programs from the general public and publish them in low-cost bare-bones packaging under the name Atari Program Exchange (APX).

With APX, authors submitted programs for consideration to Atari. If the firm accepted their creations, authors received a royalty for sales of their product through a quarterly catalog published by Atari. In retrospect, the model feels like an early, mail-order-based version of the iOS App Store.

Thanks to this push for software by people such as Crawford and Yocum, the Atari 800’s software library expanded dramatically in size and quality after 1981, with some of PC gaming’s greatest hits of the golden era originating on the machine. In addition to the groundbreaking Star Raiders, Atari’s HCS played host to seminal masterpieces such as M.U.L.E., The Seven Cities of Gold (both by Dani Bunten Berry and Ozark Softscape), and Archon (Free Fall Associates), all published by a then-new company called Electronic Arts. Text adventure games such as Infocom’s Zork also did well on Atari’s computers.

Atari’s APX arm published games such as Chris Crawford’s ambitious Eastern Front 1941. [Photo: courtesy of Atari]
An indie software market similar to the one that had been flourishing on the Apple II sprung up around the Atari 800. A few early APX titles, such as Caverns of Mars and Eastern Front (1941)—a war game from Crawford himself with revolutionary scrolling map techniques—became breakout hits that sold tens of thousands of units, at a time when that was a big deal.

Game industry legend Sid Meier, the creator of Civilization, began his professional development career at home thanks to Atari’s computer. “When I got my 800, probably the first game I wrote was very similar to Space Invaders,” Meier told me in 2007. “I took it to my local computer store, and they had very little software for sale. I put it on a cassette tape and into a plastic bag. I remember they bought 5 or 10 copies of it.”

Thousands of other small developers would develop games for Atari’s home computers by the end of the decade.

The end of an era

Even though vibrant software flourished on Atari’s home computers in the early 1980s, the platform’s business foundations remained far from certain. Atari’s home computer division remained largely unprofitable, carried along by the success of Atari’s coin-op and home console divisions. At the worst possible time for Atari, competition in the home computer space from the Texas Instruments TI-99/4A and the Commodore VIC-20 was reaching fever pitch, just as other competitive pressures came together to threaten Atari’s future.

By mid-1982, the Atari 2600 game market resembled a frenzied gold rush. The astounding financial success of Activision inspired dozens of firms, including food manufacturers and media companies, to create their own VCS software. The market became glutted with poor-to-middling quality games. Around that time, American consumers also began to embrace low-cost home PCs for gaming.

Electronic Arts’ Archon was a chess-like game with arcade-style action. [Image: courtesy of Moby Games]
With sales of 2600 hardware and software slowing down, Atari released its long-anticipated follow-up to the VCS, the Atari 5200 SuperSystem, in November 1982. Despite being over five years old, Atari’s HCS architecture remained advanced enough to form the basis of the 5200, which held its own graphically against competing consoles such as Intellivision and ColecoVision. But Atari slipped up. Most 5200 games shipped as slightly enhanced ports of earlier 400/800 titles on larger, incompatible cartridges. Terrible controllers and software incompatibility with the 2600 and 800 held the 5200 back, and a dramatic turn in the video game market the following year sealed the console’s fate.

There was more trouble on the horizon. In August 1982, Commodore released the Commodore 64, a low-cost home computer with a beefy 64K of RAM and advanced graphics that borrowed numerous pages from Atari’s playbook. It also benefited from lower raw materials cost (due to relaxed FCC rules, not as much RF shielding was required) and lower-priced chips. Like the Atari 800, the C64 included a keyboard, played games on cartridge or disk, used Atari-style joysticks, and even included a serial IO bus for disk drives and other accessories.

To undercut Texas Instruments and Atari, Commodore began a price war that dropped the cost of home PCs from $500-$1000 per machine to unsustainable $50-$200 levels by mid-1983. Earlier that year, Atari launched the lean and stylish Atari 1200XL computer, which remained largely compatible with Atari 400/800 software but shipped with 64K of RAM. With an $899 price and no significant new features over the cheaper 800, the 1200XL was a dud with both reviewers and consumers.

The Atari 1200XL had an updated look but wasn’t a technological advance over the 800. [Photo: courtesy of Atari]
In the summer of 1983, Atari released yet more iterations of the HCS in the form of the 600XL and 800XL, which replaced the aging 400 and 800 machines. They fared much better with the press (and sold well with consumers once manufacturing quantities rose the following year). But these new machines couldn’t undercut the Commodore 64 on price.

With the 64, Commodore found itself with a hit on its hands, but its scorched-earth success came at a terrible cost to the industry around it. TI pulled out of the market, and Atari sustained heavy losses that coincided with simultaneous losses in the home video game division. Commodore underwent its own round of turmoil, leading to the resignation of founder Jack Tramiel. The market would eventually recover, but the short-term damage was immense.

The troubles at Atari precipitated an investor panic in its parent company, Warner Communications, and before long, Warner began soliciting offers to offload Atari’s consumer hardware divisions. In 1984, Commodore founder Tramiel rounded up a group of investors and bought Atari’s consumer divisions for around $200 million.

Seven Cities of Gold pioneered open-world gaming—with surprisingly evocative graphics for its time. [Image: courtesy of Moby Games]
Once the dust settled and Atari’s consumer divisions changed hands, the new Atari Corporation released several other variations of the same 1977-era 400/800 architecture in 1985: The Atari 65XE and Atari 130XE (the latter of which included 128K of RAM, a first for the platform). Announced alongside the more powerful Motorola 68000-based Atari 520ST, Atari’s 8-bit machines continued to target the low-cost home computer and gaming markets.

The 1979-era Atari HCS technology received one final chance on the video game market as the Atari XE Game System in 1987, but it was too little, too late—Nintendo’s brilliant post-arcade software for the NES made the XEGS’s stale game rehashes pale in comparison.

Throughout the 1980s, Atari’s home computers remained moderately popular entry-level home machines, but they never eclipsed Commodore in market share in the U.S. Despite gaining some success with its XE line in Eastern Europe, Atari formally pulled the plug on all of its 8-bit computer products on January 1, 1992. Tramiel’s version of Atari held on a bit longer, selling newer computers and video game consoles, but reached the end of its run in 1996.

A rich legacy

As the world collapsed around corporate Atari in the 1980s, my brother and I remained blithely unaware of the turmoil. I didn’t learn about “The Great American Video Game Crash” until the 1990s. Our Atari 800 still worked, and we kept enjoying the fun moments it brought us. Endless games of Archon and Salmon Run enriched our lives. It remained our family’s chief game console until we bought an NES in 1988, and even then, we never really put the Atari away; it usually came out every year around Thanksgiving or Christmas.

Friends of the author’s family play an Atari game. [Photo: courtesy of Benj Edwards]
Even today, some 35 years after I first played an Atari 800, I am still discovering amazing new games on the platform. The catalog is deep and full of unique gameplay ideas that weren’t often seen in later 2D console games, and nostalgic hobbyists still develop new games for it. While I own a PlayStation 4, a Nintendo Switch, and a Steam PC, an Atari 800 XL takes pride of place on a desk in my family’s gaming room. My kids love it.

Just before filing this article, I unexpectedly found an old email from my dad, printed out in a binder of my Atari notes. He passed away in 2013, but a decade earlier, I had asked him about our family’s personal computer history. “We bought the Atari 800 about the time you were born,” he wrote. “It cost $1000 (plus another $450 for the disk drive later), which was more than we could afford, and mom was unhappy that I spent the money on it.”

“In retrospect, the Apple II would have been a better long term investment. But also in retrospect, stretching the budget to afford the computer was well worth it since it gave you and your brother valuable skills worth more than money. Mom knew that soon after, of course–she never held a grudge about those purchases.”

Some successes are bigger than business. The Atari home computers were a cultural phenomenon that brought joy to a generation. Thanks, dad—and happy birthday, Atari 800.



from Hacker News https://ift.tt/2EGnMMi