Monday, June 20, 2022

Apple's Feedback Mechanism Is Broken

In November, I purchased a brand-new MacBook Pro, complete with Apple’s fancy new M1 Max processor. This is, without reservation, the best computer I’ve ever owned. It is faster than my beloved iMac Pro, but considerably more portable.

At the time, I was working on what would eventually become MaskerAid. Very quickly after getting to work on the new machine, I realized that things weren’t working properly. After some research, it appeared that some aspects of the Vision Framework were not available on Apple Silicon-based Macs.

Apple’s mechanism for providing it feedback is the aptly named Feedback Assistant (née Radar). It is a full-blown app on macOS/iOS/iPadOS. In fact, if you happen to be on an Apple device, try this link. Radar was a black hole, where issues went to die… er, get marked as duplicates. Feedback Assistant, despite trying to pull the “Xfinity trick” of a rebrand that changes nothing underneath, seems to be the same as it ever was.

Regardless, I filed a Radar… er, Feedback — Apple people, if you happen to read this, it’s FB9738098. When I filed it, back on 3 November 2021, I even included a super-simple sample project to demonstrate the issue.


Apple’s feedback system is fundamentally broken — at least, for everyone who does not work at Apple.

In the roughly 225 days since I filed that feedback, I have received precisely zero… well… feedback. Apple is a big company, and surely gets an unimaginable amount of feedback filed every single day. However, I have zero indication that a human has looked at my bug. To me, it went into the black hole, never to return.

Thankfully, by virtue of my day job, I’ve had the occasion to make the acquaintance of quite a few Apple engineers. I reached out to someone who, let’s just say, should have insight into how to fix my problem. They were very helpful, and very apologetic, but their response was, in my words, “tough shit”.

Sigh.


Fast forward to early this month, and it’s WWDC. One of the best not-so-secret secrets about WWDC is that the labs are where it’s at. You can, from the comfort of your own home, spend ~30 minutes with an Apple engineer who is likely to be intimately familiar with the APIs you’re working with. So, I signed up for a lab to beg someone to fix my bug.

I didn’t expect much to come of this lab, and I started by telling the engineer I spoke with that I expected it to take just a couple of minutes. As I told them, I was just there to beg them to fix my bug.

The engineer’s response?

“Well, I do think this is only going to be a couple minutes, but it’s better than you think: I have an easy workaround for you!”

🎉


In short, when you make a face-detection request of the Vision Framework in the iOS Simulator on an Apple Silicon Mac, it fails every time.

Sample code
import UIKit
import Vision

// Note: this sample assumes a helper initializer converting
// UIImage.Orientation to CGImagePropertyOrientation (not shown here);
// FaceDetectionErrors below is a minimal stand-in error type.

/// Errors thrown when face detection cannot proceed.
enum FaceDetectionErrors: Error {
    case couldNotGetCgImageError
}

/// Asynchronously detects the faces within an image.
/// - Parameter image: Image to detect faces within
/// - Returns: Array of rects that contains faces.
///
/// - Note: The rects that are returned are percentages
///         relative to the source image. For example:
///         `(0.6651394367218018,`
///          `0.527057409286499,`
///          `0.0977390706539154,`
///          `0.1303187608718872)`
static func detectFaces(in image: UIImage) async throws -> [CGRect] {
    typealias RectanglesContinuation = CheckedContinuation<[CGRect], Error>
    
    return try await withCheckedThrowingContinuation { (continuation: RectanglesContinuation) in
        guard let cgImage = image.cgImage else {
            print("WARNING: Couldn't get CGImage")
            continuation.resume(throwing: FaceDetectionErrors.couldNotGetCgImageError)
            return
        }
                    
        var retVal: [CGRect] = []
        let request = VNDetectFaceRectanglesRequest { request, error in
            if let error = error {
                print("WARNING: Got an error: \(error)")
                // continuation.resume(throwing: error)
                return
            }
            
            if let results = request.results as? [VNFaceObservation] {
                retVal.append(contentsOf: results.map(\.boundingBox))
            } else {
                print("WARNING: Results unavailable.")
            }
            
            continuation.resume(returning: retVal)
        }
        
        let handler = VNImageRequestHandler(cgImage: cgImage,
                                            orientation: CGImagePropertyOrientation(image.imageOrientation),
                                            options: [:])
        

        do {
            try handler.perform([request])
        } catch {
            print("ERROR: Request failed: \(error)")
            continuation.resume(throwing: error)
        }
    }
}
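
For context, a call site for the helper above might look something like this; the enclosing Task and the photo image are placeholders for illustration, and the bare call assumes you’re inside the same type that declares detectFaces(in:):

// Hypothetical call site: `photo` is any UIImage to scan for faces.
Task {
    do {
        let faceRects = try await detectFaces(in: photo)
        print("Found \(faceRects.count) face(s): \(faceRects)")
    } catch {
        print("Face detection failed: \(error)")
    }
}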

The error you receive is as follows:

Request failed: Error Domain=com.apple.vis Code=9 "Could not create inference context" UserInfo={NSLocalizedDescription=Could not create inference context}

Back in the lab, I asked the engineer what they meant. As it turns out, I simply needed to add one line to my instance of VNDetectFaceRectanglesRequest:

request.usesCPUOnly = true

That’s it.

Apparently this forces the Vision Framework to use the CPU rather than the GPU for its computations. That would be pretty crummy on a real device, but it’s no problem when you’re just trying things in the Simulator.
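
For what it’s worth, here’s a minimal sketch of how one might scope that workaround to Simulator builds only, so device builds keep using the GPU. Only the usesCPUOnly line comes from the lab; the makeFaceRectanglesRequest helper and the conditional compilation around it are my own illustration:

import Vision

/// Builds a face-rectangles request, applying the CPU-only workaround
/// only when running in the iOS Simulator.
func makeFaceRectanglesRequest(completion: @escaping VNRequestCompletionHandler) -> VNDetectFaceRectanglesRequest {
    let request = VNDetectFaceRectanglesRequest(completionHandler: completion)
    #if targetEnvironment(simulator)
    // Work around the "Could not create inference context" failure in the
    // Simulator on Apple Silicon by forcing CPU-only evaluation.
    request.usesCPUOnly = true
    #endif
    return request
}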

Having my problem worked around, in the span of five minutes, with a single-line code change is both delightful and incredibly frustrating.


I got to thinking about this lab again this morning, and I’m pretty upset by it. Ultimately, I got what I wanted, but why couldn’t I have had that OVER TWO HUNDRED DAYS AGO‽ It’s infuriating.

Furthermore, as a parting shot, the engineer asked me if I had ever bothered to try talking to someone by using one of my Technical Support Incidents. The engineer meant it in good faith — they were trying to say that I didn’t have to wait from November → June to get an answer. But in a way, I almost find this more frustrating still.

Why is this the accepted way to get the attention of an engineer? For something as simple as a one-line code change, why are my only two options:

  • Wait for June and hope I get an audience with the right engineer at a lab
  • Use one of my two Technical Support Incidents and hope it’s fruitful… and that I don’t need that one for something else later in the year

Had my problem landed on the desk of the right engineer, someone incentivized to provide useful and actionable responses, it could have been worked around in just a few minutes. All I needed was a reply to my feedback with that one-liner.

Unfortunately, Feedback Assistant and Radar are tools for Apple, and they serve Apple’s needs and only Apple’s. They are a complete waste of time for outside developers. I maintain that they are a black hole into which I pour time, effort, sample code, and [often useless] sysdiagnoses. I get nothing in return.

Apple swears up and down that Feedbacks are useful. I’ve been told many times, by many teams, that they also use Feedbacks as a de facto voting mechanism to try to get a pulse on what external developers want. I’ll leave it as an exercise for the reader to think about how utterly broken that is.

Instead, let me make it clear what developers want: some indication that a human being has actually read our feedback, and a reply that is useful and actionable.

Let’s start there, if you please.


