I'll need to dig up a source but I recently heard about this company and, apparently, before offering gigs they do a credit report to determine how much debt the person is carrying (i.e. how desperate they are) and they use that information to _round down_ the hourly rate they offer them.
In the unlikely event that there are any negative consequences for this breach, they deserve every bit of them and more.
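If the description above is accurate, the mechanism would be trivial to implement. A minimal hypothetical sketch (the function, its parameters, and all numbers are invented for illustration, not taken from any real company's system):

```python
# Hypothetical sketch of the alleged practice: shading a gig worker's offered
# hourly rate downward based on their debt load from a credit pull.
# Everything here is invented for illustration.
def offered_rate(base_rate, monthly_debt_payment, monthly_income):
    """Offer less to workers who look more desperate on a credit report."""
    debt_ratio = monthly_debt_payment / monthly_income
    # Cap the haircut at 10% so the shading stays hard to notice.
    haircut = min(debt_ratio * 0.25, 0.10)
    return round(base_rate * (1 - haircut), 2)
```

The point is how little signal is needed: a debt-to-income ratio from a credit pull is enough to shade every offer downward, and the worker never sees the un-shaded rate.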
I don't remember the source, but I believe I listened to a podcast about an "uber for nurses" (not sure if it was this place) that does all sorts of nasty things that really shaft the nurses. ISTR that the nurses, when they get called in, have to be running a phone app that tracks them, and if they get stuck in traffic or lose cell signal, they get demerits. They pretty much do anything they can to give the nurses a demerit, and demerits cause your pay to go down.
So they're pretty much taking the existing terrible nursing environment in healthcare, and weaponizing it. Nurses already have too many patients and not enough CNAs, on top of 12 hour shifts, needing to do charting after those 12 hours. Healthcare squeezes nurses to the breaking point. Data point: my wife is a nurse.
Well yes, but more so it's how I expect a shitty and perversely structured industry that makes boatloads of money perpetuating a variety of huge barriers to entry to treat the employees who have the least barriers to entry protecting them.
I think I heard the same Podcast - not only do the Apps try and discover the minimum rate a Nurse might take, they’ll actively attempt to manipulate the circumstances of Nurses who were in a strong position so they too end up more dependent and exploitable.
Thanks. This is definitely the source I was referring to.
However, as it applies to my parent comment, the companies mentioned were Shiftkey, Shiftmed, and Carerev. I do not see ESHYFT mentioned, so I stand corrected.
What's interesting is that, broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong. Take the stock market, for example: insider trading is illegal, and you don't often hear calls to reverse these laws.
But when it comes to private markets and semi-private negotiations, that same sentiment doesn't easily transfer. Does society benefit in some unique way from allowing asymmetries in labor negotiations, private markets like Uber, or B2C relations like Robinhood (1, 2)?
1. https://www.sec.gov/newsroom/press-releases/2020-321
2. Note, Robinhood was fined not for front-running customers, just for falsely claiming customers received quality orders. I suspect they've only stopped the latter behavior.
> broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong
I don't think that's true at all. Companies and individuals negotiate all the time with information the other party doesn't have. Insider trading is about fairness on public markets, so that every negotiating party of the same type has the same information, and it is quite specific to that.
> What's interesting is that broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong. Take the stock market for example, insider trading is illegal and you don't often hear calls to reverse these laws.
Insider trading is not about fairness. It’s about theft. If you overhear someone in a public place talking about an upcoming merger, you can trade on it.
Incentive-wise, you're probably a lot better off if your own broker is front-running you than if an HFT desk at a liquidity provider firm is doing it, since the broker is at least in a position to kick some of that back to you in the form of reduced fees or whatever.
I agree with you. I'd go further and suggest that candidates should get anonymized information about applicants in the pool. Nothing like negotiating with yourself for a job...
> What's interesting is that broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong.
They do? I’m quite happy when I have more information than the party I am negotiating with.
Do you tell your customers all of the input costs of the product or service you sell? I doubt it.
Also, certain parties that trade in public markets have way more information than any retail investor could ever hope to have: hedge funds buy satellite imagery of parking lots, track oil tankers at sea, etc., to gain an edge.
Insider trading rules are meant to prevent the public from bagholding stocks sold by a management team with insider information that no other market participant could or should have. There are no rules against legally gathering or purchasing information on your own to gain an edge over other market participants.
It's definitely shady, but it's par for the course. Uber charges you more if you have more gift cards loaded, or just spend more on average in general. You charge what the market will bear.
No, it just hasn’t been possible to differentiate as well before.
One example is biscuit manufacturing, where it’s a fairly open secret that supermarket own brand biscuits are the same product as name brand, because it’s better to capture that segment at a lower margin than to lose it to competition.
Tech now makes it possible to target individuals rather than demographics, but there’s nothing inherently against the status quo in doing so.
The market is an agglomeration of many individuals, meaning that there is no hard and fast rule that you must charge only one price for the entire market; indeed, many custom-priced products exist, enterprise SaaS being one example.
I have a side business and virtually every customer pays a different price, what you’re saying is simply not true. Airlines do it, hotels do it, I have different rates for my customers at my day job, etc.
And even if they pay the same price, they’ll have different costs.
I’ll gladly take all the free alcohol an airline will give me, but other people don’t at all!
I sell some stuff on eBay. If you appear untrustworthy, I’ll spend more for tracking/better tracking on your order so you’ll actually get your stuff faster/more reliably.
There's no such thing as "the market", there are market segments that abstractly represent groups of people with similar characteristics. Charging different prices to people in different segments is standard business practice. Burger chains could charge wealthy individuals $100k per burger if they wanted to, just, burger chains usually have difficulty distinguishing the truly wealthy individuals who walk in the door who would have no trouble putting down that kind of money for a burger.
.... which, in the day and age of facial recognition, gives me an idea for a startup.
Burger chains have at least gotten a start on differentiating their pricing - by raising prices dramatically across the board, and telling anyone who’s frugal or just broke that they can only get discounts (to bring prices slightly lower than today’s pricing, but still a lot more than before) if they use the app. Upper-class people don’t bother with it and pay full price, frugal people take the time to figure out the cheapest way to use one of the current “offers” to assemble a meal.
Upper class people don't bother with it because we all know those discounts are temporary, but they'll never let go of the data they extract from those apps and will try to spam you.
One can always use a fake email and login account. Upper class people don't bother because they don't eat at fast food chains often enough to warrant needing an app for each one; 99.9% don't give a shit about data collection, only people on HN and other technical fora do.
>One can always use a fake email and login account.
When you're using that fake email be sure to have a burner phone or public internet so they can't link it to your IP, also don't use your computer or any computer you've logged in on so that browser fingerprinting doesn't tag you, also turn off your GPS so they can't geo correlate you.
Of course the rich person is in the same boat, their geolocation will log that they went to Burger King, or their credit card company will snitch on them. Okay, fine, pay with cash, cover your phone in a tin foil faraday cage. Now you also will need to drive a 30 year old car to said establishment since the car manufacturer put a cellular modem and GPS in your car and sells the fact that you went to Burger King to the highest bidder.
That's all I can think of off the top of my head, I'm sure there are dozens of other ways people are tagged. At some level may as well either use the app or just not go.
Pieces of shit. And then they assign you a score for each trip, as if you are really "carpooling" when in reality it's a shitty taxi replacement (not that taxis are on a moral high ground, but the point still stands).
We don't need names, we need legislation, and we need to vote for people who will write it, as opposed to grifters who only seek to pad the pockets of billionaires.
These predators aren't scared of name and shame. Any publicity is good publicity (And if it actually gets bad, they'll sue the pants off you.). They are scared shitless of laws censuring their behavior. It's why they fight like mad to ensure that they aren't subject to them.
I’m interested, given the massive nursing shortages, why any nurses were using this service at all? Especially at higher levels, there’s no reason to mess with a shitty app that underpays you, when you should be able to walk into any provider’s office or facility and get hired almost immediately (and for RNs, you even have wide-ranging telehealth options).
This was my thought exactly. There is a giant nursing shortage. I know some nurses who are traveling nurses and they make bank, and they don't need any BS app. (Just want to emphasize, nursing is an incredibly difficult job at the moment, but there are also currently weird dynamics where traveling nurses can actually make a lot more than "stationary" nurses).
Thus, I'm led to believe that nurses using this app have to have some sort of difficulty finding jobs for other reasons, or they're just not informed about their options.
I imagine many of them are people who can't commit to full or even part-time jobs because of responsibilities like childcare or eldercare; their own physical or mental health issues; etc.
You can get paid more as a contractor than an employee.
Some may just want to pick up casual shifts without any obligation on top of their full-time work. This is kinda double dipping, because your full-time job is paying your benefits, so why work overtime at time-and-a-half for them when you can get 2x+ somewhere else, with extra pay in lieu of benefits?
Big orgs don’t want to deal with 1000 different individual contractors (especially if it means taking on the risk of employees potentially being misclassified as contractors).
I think the bigger issue is the myth of nurse fungibility. A rando nurse unfamiliar with your setup/org is unlikely to be very productive.
At scale, the corner cases don't really matter. In aggregate, if it's decently well correlated and readily available, it's probably going to be used.
I can't find it now, but I believe LexisNexis or another large similar reporting/data agency had a product catalog of dozens of products that spit out values for ability to pay, disposable income monthly, annual income, etc.
It makes you feel awful thinking about the direction things are headed: corporations approaching omniscience regarding all facets of our lives that are reasonably of value to them.
Agreed. And it's not just those -- if you need to pay off debt, you're extra-incentivized to take the highest-paying job, as opposed to one that pays less but is e.g. closer to home, or has a more predictable schedule, or whatever.
The idea that you'd offer less seems... counterproductive to say the least.
It might not be about high pay, it might be about increasing the odds of a nurse dropped into a dysfunctional environment staying there and not bouncing on day 2 or week 2.
In the section of their Privacy Policy titled Data Security [0]:
> We use certain physical, managerial, and technical safeguards that are designed to improve the integrity and security of information that we collect and maintain. Please be aware that no security measures are perfect or impenetrable. We cannot and do not guarantee that information about you will not be accessed, viewed, disclosed, altered, or destroyed by breach of any of our physical, technical, or managerial safeguards. In particular, the Service is NOT designed to store or secure information that could be deemed to be Protected Health Information as defined by the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”).
IANAL and all that, but I’m not sure you can use the excuse “We didn’t design our system to be HIPAA compliant, sorry,” and hope your liability disappears. Does anyone know?
> I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
It looks like providers accidentally uploaded some PHI.
IANAL so may be wrong, but I worked for a healthcare company. Whether HIPAA applies to them depends on if they are considered a covered entity or a business associate [0].
IMO they aren't bound to HIPAA requirements as a covered entity.
Business associate status is a little tricky to determine. But business associates have to sign a BAA (Business Associate Agreement), and I doubt they would have signed one if they have that language in their privacy policy.
Also, just as a side note, HIPAA is not an ideal standard for security to begin with. Many large companies exchange bulk PHI via Gmail, since it is HIPAA compliant.
> Also, just as a side note, HIPAA is not an ideal standard for security to begin with. Many large companies exchange bulk PHI via Gmail, since it is HIPAA compliant.
You seem to imply using GMail is a bad thing? I think GMail, when appropriately configured to handle PHI, is probably a million times more secure than some crappy bespoke "enterprise" app.
It isn't that hard to set up a secure SFTP server to automate the exchange. But then again, this is a post about configuring an S3 bucket with public access for SSNs.
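For what it's worth, the "automate the exchange" part really is small. A hedged sketch using OpenSSH's `sftp` in batch mode (host, user, key path, and directory names are all placeholder assumptions, not anyone's real setup):

```python
import os
import subprocess
import tempfile

def build_sftp_batch(local_files, remote_dir):
    """Build an sftp batch script that uploads each file into remote_dir."""
    lines = [f"cd {remote_dir}"]
    lines += [f"put {path}" for path in local_files]
    lines.append("bye")
    return "\n".join(lines) + "\n"

def push_files(host, user, key_path, local_files, remote_dir="/inbound"):
    """Run the upload non-interactively; key-based auth avoids password prompts."""
    batch = build_sftp_batch(local_files, remote_dir)
    with tempfile.NamedTemporaryFile("w", suffix=".sftp", delete=False) as f:
        f.write(batch)
        batch_path = f.name
    try:
        # -b runs the batch file non-interactively and aborts on the first error.
        subprocess.run(
            ["sftp", "-i", key_path, "-b", batch_path, f"{user}@{host}"],
            check=True,
        )
    finally:
        os.unlink(batch_path)
```

Cron that and you have the basic pipe. The hard part, as the replies note, is everything around it: key management, per-partner onboarding, misdirected files, and audit trails, which is where the "secure file drop" products earn their keep.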
The issue with Gmail is sending to the wrong email, sending to a broad email list, and having people download it to their local machines. And the amount of PHI being transmitted in these files is larger than this S3 bucket.
> It isn't that hard to set up a secure SFTP server to automate the exchange
When you've got a trickle of information coming and going from hundreds or thousands of other individuals working at tens or hundreds of other entities it is.
You'd eventually wind up developing the kind of ridiculous "secure messaging and file drop" type service that every megabank builds on top of their SFTP and ticketing systems for that purpose. That stuff ain't cheap to run and keep running.
Better to just start with a solution that's 99% there.
HIPAA only applies to a very specific entity called a "covered entity". At a high level, "covered entities" are health care providers that accept insurance or insurers. That's right, there's a massive caveat on "accepts insurance". You can be a healthcare provider and do not have to comply with HIPAA if you don't accept insurance.
That being said, HIPAA isn't even relevant here because "ESHYFT" is just a provider of labor. No different than a big consultancy providing staff augmentation services.
> At a high level, "covered entities" are health care providers that accept insurance or insurers. That's right, there's a massive caveat on "accepts insurance". You can be a healthcare provider and do not have to comply with HIPAA if you don't accept insurance.
Again, HIPAA continues to be the most colloquially misunderstood law out there.
The rule that makes providers "covered entities" isn't really about insurance, it's about whether they transmit specific HIPAA "transactions" electronically. Now, yes, most of these transactions having to do with providers are things like claim submissions or pre-authorizations to insurance. But there are other reasons a provider may need/want to send a HIPAA transaction electronically.
My point is that there isn't some sort of "loophole" where providers that don't accept insurance are somehow being sneaky. The whole point of the HIPAA security rule is to protect PHI when it is transferred around to different entities in the healthcare system. If the information is going just between you and your doctor, HIPAA isn't relevant, and that is by design.
> it's about whether they transmit specific HIPAA "transactions" electronically.
That's correct, but if you don't accept insurance then you will not transmit anything that meets the criteria to be covered by HIPAA. At least, in terms of being a provider. Things are different if you're a health plan or clearing house.
I spent a lot of time and money questioning this with lawyers at a health tech startup I previously worked at. The underlying reality is nearly the entire US healthcare system falls under HIPAA because nearly everyone wants to accept insurance. However, if you're a doctor running a cash-only business you will not be a covered entity, even if you send PHI electronically.
HIPAA doesn't care about your POS TOS. It either applies or does not.
That said, it's both less broad and more toothless than I'd like. If FB convinces you to install a tracking pixel (like button) stealing your private medical data, they likely haven't violated any laws. At most you'd be able to file a claim against the person who created the leak.
Not a lawyer and all that, but for TFA I don't think HIPAA would be a valid way to try to limit your losses. It's a bit closer to what would happen if you (a doctor) uploaded patient data to Google Drive and then somehow leaked that information (one of Google's contractors disclosing it, a hack, whatever). Nothing about ESHYFT's offerings requires or would be benefited by the data HIPAA protects, and (ignoring incompetence and other factors) I'd be as surprised to see my health data leaked there as I would to see a YT video going over my last lab reports because of some hospital's actions.
They could still be liable for all sorts of other damages (and maybe somebody can convince a court of a HIPAA violation), but it's not an easy HIPAA win.
If you partner with a healthcare provider to provide any sort of technical services, you will be required to sign a BAA (Business Associates Agreement), which makes you similarly liable to the HIPAA & HITECH acts.
>With persons or organizations (e.g., janitorial service or electrician) whose functions or services do not involve the use or disclosure of protected health information, and where any access to protected health information by such persons would be incidental, if at all.
Based on the context from the article of the PHI uploaded being incidental, it would probably fall under this exception. It sounds like ESHYFT isn't meant to be storing any PHI based on the privacy policy above.
The PII of the nurses being accidentally shared by a staffing agency isn't a HIPAA violation. Yes the nurses are providers but their relationship with the Uber for nurses service isn't a medical provider relationship. It's definitely a legal and ethical failing but I don't think it's a HIPAA one.
This is what I took away from the reading. It's basically a shift/employee management platform. The only reason we're even discussing HIPAA is because it's healthcare-industry adjacent.
If you replaced nurses with gig workers and uber for nurses with something like WeWork this would just be like every other leak we talk about on HN.
HIPAA avoidance is much narrower than that. Entities which perform administrative or managerial duties on behalf of a mandated organization that have to transmit PII to provide that service are also covered, even if the entity itself isn't a provider.
If 'Uber for nurses' is acting on behalf of nurses, it probably doesn't apply? If it's acting on behalf of the hospitals (who are indisputably covered entities), then the situation is much less clear.
I encountered a similar situation with my startup many years ago and decided "better safe than sorry" after consulting the lawyer.
I used to work in the field. HIPAA protects patient data, not provider data. If my understanding is correct that only nurse PII was leaked, this has nothing to do with HIPAA.
In general, I've found that people tend to think HIPAA applies much, much more than it actually does. Like people thinking if you're in a meeting at work with clients and say "Sorry, Bob couldn't be here today, he's got the flu" that that's a HIPAA violation. No, it's not.
This is just an employee data leak, just like a bajillion other employee data leaks. The fact that the employees happen to be nurses still doesn't mean it has anything to do with HIPAA.
ESHYFT isn't a covered entity, so HIPAA doesn't apply to them. Even if they have health data of their employees in their system, they're still not a covered entity.
Really, "Uber for Nurses" is a title to drum up interest. "Large Staffing Service" would be factually accurate.
>I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
The title is exaggerating what the article says, and the article is making a big stretch about this being possibly HIPAA-covered. I stand corrected: this has nothing to do with HIPAA.
What was leaked was nurses' doctors notes submitted justifying calling out of work. Still a serious leak but nowhere near what is being suggested.
I'm confused because the article lays it out by the 4th paragraph, and you have the right understanding, up until "we're a startup"
Maybe you think the startup maintains patient records?
The article lays out that the nurses, the providers, uploaded them. This is a temp booking system. The health records were uploaded by the nurses to communicate reasons for absences to their employer and weren't required or requested.
They have as much responsibility as Dropbox does. Nurses shouldn't have uploaded them.
Worth mentioning, because the authority level of medical practitioners throws people off: don't ever give a doctor or practice your Social Security Number. They don't need it. Similarly, if they want to check an ID, that doesn't mean scan or photograph it. Doctors, practices, etc. are the worst at infosec. They have no training, basically no penalties if they do something wrong, and all of that info is only to follow up in case you don't pay your bill.
In the US, HIPAA is pretty much the strongest privacy legislation there is. There's probably no group that would have a more severe penalty for leaking your info than your healthcare provider.
HIPAA has strict rules with severe penalties, but enforcement is at best spotty. So honest hospitals and doctors offices bend over backwards to comply with the rules at great expense, but bad actors are rarely punished. It's the worst of both worlds. I'm pretty sure that is why the punishments are so harsh, because they need to put the fear of god into practitioners to make them take it seriously since there are so few inspectors.
It's the difference in medical establishment skill level between your doctor and you. You are always at a disadvantage. I've long thought that a disinterested third party needs to be involved. Someone with real oversight taking a position adversarial to the hospital and strictly to create the best possible outcome for the patient.
This is true, however getting it funded is the difficult task.
For it to be effective, the money can't come from the provider, meaning it's either from the payer or the patient. The payer doesn't really care, costs are contained as far as they are concerned, with the various Quality Initiatives. That leaves the patient to sign up for a subscription model.
I explored that as a business 12 years ago, and sadly there is still a need. The worst part is that most clinicians actually want to do the right thing but it's the admins in their organization who set up processes that result in terrible outcomes.
Perhaps true, but the strongest privacy protections in the US are still pretty weak. The biggest penalty I know of is Anthem 2018, where they leaked HIPAA-qualifying records on 80 million customers. Their financial penalty was a whopping... $16 million. Two dimes per affected customer!
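The per-customer figure above is easy to verify; both inputs are the figures as cited in the comment:

```python
# Anthem 2018 HIPAA settlement, figures as cited in the comment above.
penalty_usd = 16_000_000          # OCR settlement amount
affected_customers = 80_000_000   # approximate records exposed

per_customer = penalty_usd / affected_customers  # 0.20 USD: two dimes each
```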
It's true that the US rarely penalizes corporations enough to really disincentivize things, but healthcare providers probably take client data security more seriously than just about any other group besides maybe law firms. It's weird to single them out as being particularly unconcerned with and unpenalized for leaks.
HIPAA was designed for portability -- the 'P' stands for portability, not privacy -- of health info, so there are immense carve-outs in service of that objective. Fines for violating HIPAA are almost non-existent.
HIPAA is wildly misunderstood by the public as a strong safeguard, meanwhile medical offices just get any patient (a captive audience) to sign a release waiver as part of patient intake ...
They get patients to sign something permitting them to share PHI with other entities like e.g. the lab that runs blood work, not to disclaim liability for leaking it unintentionally.
How many healthcare providers do you know personally who have faced severe penalties for leaking information?
The reality is that for a small doctor/dental/whatever office, there is essentially 0 risk. HIPAA violations that carry significant penalties go to huge hospitals and healthcare companies.
Your neighborhood doctor has to screw up in a major way for an extended period of time to have a minute risk of any consequence.
How much information do you think your neighborhood PCP is “leaking” compared to, say, Elevance? This is such a goofy take. Are you expecting that every small provider group is just firing your data off on Facebook every Tuesday, and somehow, no one cares? They’re all using certified EMRs. They all take security seriously because their licenses are literally on the line. Do you work in healthcare?
If they provably expose your data, and you report them, they will get fined. Or they would have last year, who knows if those people still have jobs.
In my experience, no one has ever asked for it when booking, just when you fill out forms on your first visit. I always leave it (and most other things that don't pertain to my healthcare issue) blank and have never been hassled.
I also always ask for a paper copy of the disclosures to sign, saying that "I don't sign blank checks" when asked to sign the electronic pad. I've never had an issue with them printing it out, letting me sign, and them scanning it in.
Healthcare "security"/"authentication" is just "protected" by your name and date of birth which is easily discovered for anyone online.
Not in my case; I do not provide my Social Security Number to (new-to-me) healthcare providers either, from small practices to major hospitals with different branches.
Isn't this exactly what you'd expect from an Uber for (something)?
Garbage company, garbage culture, garbage business model.
Several of my family members were or have been nurses for decades and your wife’s experience mirrors the experiences I’ve seen from that distance.
And I’ve heard “it used to be so much worse”.
The American healthcare system is fairly well broken from virtually every angle.
This is the presentation that discusses this wage suppression for nurses:
https://pluralistic.net/2025/02/26/ursula-franklin/
This is abhorrent if true; truly evil behavior.
Yes... And why is theft bad?
Might as well get a pat on the head with your punch in the face if you're going to definitely get punched in the face either way.
I don't disagree with you, but wow that requires a bleak outlook.
That is why it should be mandatory for companies to publish the salary range for a role.
I agree with you. I'd go further and suggest that candidates should get anonymized information about applicants in the pool. Nothing like negotiating with yourself for a job...
> What's interesting is that broadly speaking, people acknowledge that negotiating with asymmetric information is immoral or wrong.
They do? I’m quite happy when I have more information than the party I am negotiating with.
Do you tell your customers all of the input costs of the product or service you sell? I doubt it.
Also, certain parties that trade in public markets have way more information than any retail investor could ever hope to have, hedge funds buy satellite imagery of parking lots, track oil tankers at sea, etc to gain an edge.
Insider trading rules are meant to prevent the public from bagholding stocks while the management team trades on insider information that no other market participant could or should have; there are no rules against legally gathering or purchasing information on your own to gain an edge over other market participants.
It's definitely shady, but it's par for the course. Uber charges you more if you have more gift cards loaded, or just spend more on average in general. You charge what the market will bear.
You charge what the market will bear, not the individual.
No, it just hasn’t been possible to differentiate as well before.
One example is biscuit manufacturing, where it’s a fairly open secret that supermarket own brand biscuits are the same product as name brand, because it’s better to capture that segment at a lower margin than to lose it to competition.
Tech now makes it possible to target individuals rather than demographics, but there’s nothing inherently against the status quo in doing so.
Nothing against the status quo. Yes, let’s perpetuate our dystopian nightmare. Good plan.
Didn’t say it was a good plan, just that unless you’ve got some brilliant replacement for late-stage capitalism, it’s a logical progression.
The post you’re replying to is an ‘is’ post, not an ‘ought’ post.
The market is an agglomeration of many individuals, meaning that there is no hard and fast rule that you must charge only one price for the entire market; indeed, many custom-priced products exist, enterprise SaaS being one example.
I have a side business and virtually every customer pays a different price, what you’re saying is simply not true. Airlines do it, hotels do it, I have different rates for my customers at my day job, etc.
And even if they pay the same price, they’ll have different costs.
I’ll gladly take all the free alcohol an airline will give me, but other people don’t at all!
I sell some stuff on eBay. If you appear untrustworthy, I’ll spend more for tracking/better tracking on your order so you’ll actually get your stuff faster/more reliably.
There's no such thing as "the market"; there are market segments that abstractly represent groups of people with similar characteristics. Charging different prices to people in different segments is standard business practice. Burger chains could charge wealthy individuals $100k per burger if they wanted to; it's just that burger chains usually have difficulty distinguishing the truly wealthy individuals walking in the door who would have no trouble putting down that kind of money for a burger.
.... which, in the day and age of facial recognition, gives me an idea for a startup.
Burger chains have at least gotten a start on differentiating their pricing - by raising prices dramatically across the board, and telling anyone who’s frugal or just broke that they can only get discounts (to bring prices slightly lower than today’s pricing, but still a lot more than before) if they use the app. Upper-class people don’t bother with it and pay full price, frugal people take the time to figure out the cheapest way to use one of the current “offers” to assemble a meal.
Upper-class people don't bother with it because we all know those discounts are temporary, but the companies will never let go of the data they extract from those apps, and they will try to spam you.
One can always use a fake email and login account. Upper-class people don't bother because they don't eat at fast food chains as often as lower-class people, not often enough to warrant needing an app for each one; 99.9% don't give a shit about data collection, only people on HN and other technical fora do.
>One can always use a fake email and login account.
When you're using that fake email, be sure to use a burner phone or public internet so they can't link it to your IP. Also don't use your own computer, or any computer you've logged in on, so that browser fingerprinting doesn't tag you. And turn off your GPS so they can't geo-correlate you.
Of course the rich person is in the same boat: their geolocation will log that they went to Burger King, or their credit card company will snitch on them. Okay, fine, pay with cash and cover your phone in a tin-foil Faraday cage. Now you also need to drive a 30-year-old car to said establishment, since the car manufacturer put a cellular modem and GPS in your car and sells the fact that you went to Burger King to the highest bidder.
That's all I can think of off the top of my head, I'm sure there are dozens of other ways people are tagged. At some level may as well either use the app or just not go.
Yep, that's why it's usually not worth it trying to pretend one has online privacy.
The market (mostly) ensures there is more than one individual.
Aren’t they just creating a market of 1?
"just" is doing a lot of heavy lifting here
Aren’t they creating a market of 1?
Perfect!
Pieces of shit. And then they assign you a score for each trip, as if you are really "carpooling" when in reality it is a shitty taxi replacement (not that taxis are on any moral high ground, but the point still stands).
Game theory transcends basic humanity.
You might be surprised to learn that they're not the only company to do so.
Names. We need names.
Not of companies. Of the people who choose to work for them (or, rather, choose not to stop working for them after they build these "features").
We don't need names, we need legislation, and we need to vote for people who will write it, as opposed to grifters who only seek to pad the pockets of billionaires.
These predators aren't scared of name and shame. Any publicity is good publicity (And if it actually gets bad, they'll sue the pants off you.). They are scared shitless of laws censuring their behavior. It's why they fight like mad to ensure that they aren't subject to them.
> These predators aren't scared of name and shame.
There are exceptions. See the ongoing kerfuffle over "DOGE" employee lists.
It's amazing that, on a cursory look, only 11 states make this practice illegal. The "AI scrip town" is growing.
In America you can get (buy) someone's credit report without their permission?
I’m interested, given the massive nursing shortages, in why any nurses were using this service at all. Especially at higher levels, there’s no reason to mess with a shitty app that underpays you, when you should be able to walk into any provider’s office or facility and get hired almost immediately (and for RNs, you even have wide-ranging telehealth options).
This was my thought exactly. There is a giant nursing shortage. I know some nurses who are traveling nurses and they make bank, and they don't need any BS app. (Just want to emphasize, nursing is an incredibly difficult job at the moment, but there are also currently weird dynamics where traveling nurses can actually make a lot more than "stationary" nurses.)
Thus, I'm led to believe that nurses using this app have to have some sort of difficulty finding jobs for other reasons, or they're just not informed about their options.
I imagine many of them are people who can't commit to full or even part-time jobs because of responsibilities like childcare or eldercare; their own physical or mental health issues; etc.
Or they have full time jobs already and aren’t generally interested in extra shifts, unless the price is right.
You can get paid more as a contractor than an employee.
Some may just want to pick up casual shifts without any obligation on top of their full-time work. This is kinda double dipping: your full-time work is paying your benefits, so why work overtime at time-and-a-half for them when you can get 2x+ somewhere else, plus pay in lieu of benefits?
Big orgs don’t want to deal with 1000 different individual contractors (especially if it means taking on the risk of potentially misclassifying employees as contractors).
I think the bigger issue is the myth of nurse fungibility. A rando nurse unfamiliar with your setup/org is unlikely to be very productive.
That seems like a terrible way to estimate nurse wages.
People have spouses.
People’s parents pay credit cards.
People with bad credit sometimes don’t care.
People have family money.
People with low debt can be desperate for work.
Does it even work?
At scale, the corner cases don't really matter. In aggregate, if it's decently well correlated and readily available, it's probably going to be used.
I can't find it now, but I believe LexisNexis or another large similar reporting/data agency had a product catalog of dozens of products that spit out values for ability to pay, disposable income monthly, annual income, etc.
It makes you feel awful thinking about the direction things are headed: corporations approaching omniscience regarding all facets of our lives that are reasonably of value to them.
> corporations approaching omniscience regarding all facets of our lives
People happily give away a lot of info voluntarily, for example by paying with a card instead of cash.
But I’d argue they aren’t corner cases.
Most people I know with bad credit aren’t desperate for money. At least not educated, highly paid ones like nurses.
Most just ignore their financial problems in the hope they go away.
Not to mention nurse demand outstrips supply, so they have options and can certainly turn down bad offers.
Agreed. And it's not just those -- if you need to pay off debt, you're extra-incentivized to take the highest-paying job, as opposed to one that pays less but is e.g. closer to home, or has a more predictable schedule, or whatever.
The idea that you'd offer less seems... counterproductive to say the least.
It might not be about high pay, it might be about increasing the odds of a nurse dropped into a dysfunctional environment staying there and not bouncing on day 2 or week 2.
Proper data privacy laws would make this sort of thing nearly impossible
In the section of their Privacy Policy titled Data Security [0]:
> We use certain physical, managerial, and technical safeguards that are designed to improve the integrity and security of information that we collect and maintain. Please be aware that no security measures are perfect or impenetrable. We cannot and do not guarantee that information about you will not be accessed, viewed, disclosed, altered, or destroyed by breach of any of our physical, technical, or managerial safeguards. In particular, the Service is NOT designed to store or secure information that could be deemed to be Protected Health Information as defined by the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”).
IANAL and all that, but I’m not sure you can use the excuse “We didn’t design our system to be HIPAA compliant, sorry,” and hope your liability disappears. Does anyone know?
0: https://eshyft.com/wp-content/uploads/2019/06/ESHYFT-Privacy...
HIPAA applies to patient data, not provider data.
> I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
It looks like providers accidentally uploaded some PHI.
IANAL so may be wrong, but I worked for a healthcare company. Whether HIPAA applies to them depends on if they are considered a covered entity or a business associate [0].
IMO they aren't bound to HIPAA requirements as a covered entity.
Business associate is a little tricky to determine. But business associates have to sign a BAA (Business Associate Agreement). And I doubt they would have signed one if they have that in their privacy policy.
Also, just as a side note, HIPAA is not an ideal standard for security to begin with. Many large companies exchange bulk PHI via Gmail, since it is HIPAA compliant.
0: https://www.hhs.gov/hipaa/for-professionals/covered-entities...
> Also just as a side note, HIPAA is not an ideal standard to begin with for security. Many large companies exchange bulk PHI via gmail since it is HIPAA compliant.
You seem to imply using GMail is a bad thing? I think GMail, when appropriately configured to handle PHI, is probably a million times more secure than some crappy bespoke "enterprise" app.
It isn't that hard to set up a secure SFTP server to automate the exchange. But then again, this is a post about configuring an S3 bucket with public access for SSNs.
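For the skeptical, a chrooted, SFTP-only account on stock OpenSSH really is only a handful of lines. Sketch below; the account name and paths are placeholders of my own, not anything from the article:

```shell
# Create a password-less exchange-only account
# ("phi-exchange" and the /srv/sftp paths are hypothetical names)
sudo adduser --disabled-password --gecos "" phi-exchange
sudo mkdir -p /srv/sftp/phi-exchange/inbox
sudo chown root:root /srv/sftp/phi-exchange        # chroot dir must be root-owned
sudo chown phi-exchange: /srv/sftp/phi-exchange/inbox

# Append to /etc/ssh/sshd_config:
#   Match User phi-exchange
#       ChrootDirectory /srv/sftp/phi-exchange
#       ForceCommand internal-sftp
#       AllowTcpForwarding no
#       X11Forwarding no
sudo systemctl reload sshd   # service may be named "ssh" on Debian/Ubuntu
```

Add key-based auth and a cron job doing the pull on the other side and you have an automated exchange with no shell access and no way to wander outside the drop directory.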
The issue with Gmail is sending to the wrong address, sending to a broad email list, or having people download PHI to their local machines. And the amount of PHI being transmitted in these files is larger than what was in this S3 bucket.
>It isn't that hard to setup a secure SFTP server to automate the exchange
When you've got a trickle of information coming and going from hundreds or thousands of other individuals working at tens or hundreds of other entities it is.
You'd eventually wind up developing the kind of ridiculous "secure messaging and file drop" type service that every megabank builds on top of their SFTP and ticketing systems for that purpose. That stuff ain't cheap to run and keep running.
Better to just start with a solution that's 99% there.
HIPAA only applies to a very specific entity called a "covered entity". At a high level, "covered entities" are health care providers that accept insurance or insurers. That's right, there's a massive caveat on "accepts insurance". You can be a healthcare provider and do not have to comply with HIPAA if you don't accept insurance.
That being said, HIPAA isn't even relevant here because "ESHYFT" is just a provider of labor. No different than a big consultancy providing staff augmentation services.
> At a high level, "covered entities" are health care providers that accept insurance or insurers. That's right, there's a massive caveat on "accepts insurance". You can be a healthcare provider and do not have to comply with HIPAA if you don't accept insurance.
Again, HIPAA continues to be the most colloquially misunderstood law out there.
The rule that makes providers "covered entities" isn't really about insurance; it's about whether they transmit specific HIPAA "transactions" electronically. Now, yes, most of the transactions having to do with providers are things like claim submissions or pre-authorizations to insurance. But there are other reasons a provider may need or want to send a HIPAA transaction electronically.
My point is that there isn't some sort of "loophole" where providers that don't accept insurance are somehow being sneaky. The whole point of the HIPAA security rule is to protect PHI when it is transferred around to different entities in the healthcare system. If the information is going just between you and your doctor, HIPAA isn't relevant, and that is by design.
> it's about whether they transmit specific HIPAA "transactions" electronically.
That's correct, but if you don't accept insurance then you will not transmit anything that meets the criteria to be covered by HIPAA. At least, in terms of being a provider. Things are different if you're a health plan or clearing house.
I spent a lot of time and money questioning this with lawyers at a health tech startup I previously worked at. The underlying reality is nearly the entire US healthcare system falls under HIPAA because nearly everyone wants to accept insurance. However, if you're a doctor running a cash-only business you will not be a covered entity, even if you send PHI electronically.
HIPAA doesn't care about your POS TOS. It either applies or does not.
That said, it's both less broad and more toothless than I'd like. If FB convinces you to install a tracking pixel (like button) stealing your private medical data, they likely haven't violated any laws. At most you'd be able to file a claim against the person who created the leak.
Not a lawyer and all that, but for TFA I don't think HIPAA would be a valid way to try to limit your losses. It's a bit closer to what would happen if you (a doctor) uploaded patient data to Google Drive and then somehow leaked that information (one of Google's contractors disclosing it, a hack, whatever). Nothing about ESHYFT's offerings requires or would be benefited by the data HIPAA protects, and (ignoring incompetence and other factors) I'd be as surprised to see my health data leaked there as I would to see a YT video going over my last lab reports because of some hospital's actions.
They could still be liable for all sorts of other damages (and maybe somebody can convince a court of a HIPAA violation), but it's not an easy HIPAA win.
If you're not a direct health provider, you probably can. Don't take that as an endorsement.
If you partner with a healthcare provider to provide any sort of technical services, you will be required to sign a BAA (Business Associates Agreement), which makes you similarly liable to the HIPAA & HITECH acts.
It depends; there are some exceptions. [0]
>With persons or organizations (e.g., janitorial service or electrician) whose functions or services do not involve the use or disclosure of protected health information, and where any access to protected health information by such persons would be incidental, if at all.
Based on the context from the article of the PHI uploaded being incidental, it would probably fall under this exception. It sounds like ESHYFT isn't meant to be storing any PHI based on the privacy policy above.
0: https://www.hhs.gov/hipaa/for-professionals/privacy/guidance...
[Nevermind]
The PII of the nurses being accidentally shared by a staffing agency isn't a HIPAA violation. Yes the nurses are providers but their relationship with the Uber for nurses service isn't a medical provider relationship. It's definitely a legal and ethical failing but I don't think it's a HIPAA one.
This is what I took away from the reading. It's basically a shift/employee management platform. The only reason we're even discussing HIPAA is because it's healthcare-industry adjacent.
If you replaced nurses with gig workers and uber for nurses with something like WeWork this would just be like every other leak we talk about on HN.
HIPAA avoidance is much narrower than that. Entities which perform administrative or managerial duties on behalf of a mandated organization that have to transmit PII to provide that service are also covered, even if the entity itself isn't a provider.
If 'Uber for nurses' is acting on behalf of nurses, it probably doesn't apply? If it's acting on behalf of the hospitals (who are indisputably covered entities), then the situation is much less clear.
I encountered a similar situation with my startup many years ago and decided "better safe than sorry" after consulting the lawyer.
I used to work in the field. HIPAA protects patient data, not provider data. If my understanding is correct that only nurse PII was leaked, this has nothing to do with HIPAA.
In general, I've found that people tend to think HIPAA applies much, much more than it actually does. Like people thinking if you're in a meeting at work with clients and say "Sorry, Bob couldn't be here today, he's got the flu" that that's a HIPAA violation. No, it's not.
This is just an employee data leak, just like a bajillion other employee data leaks. The fact that the employees happen to be nurses still doesn't mean it has anything to do with HIPAA.
ESHYFT isn't a covered entity, so HIPAA doesn't apply to them. Even if they have health data of their employees in their system, they're still not a covered entity.
Really, "Uber for Nurses" is a title to drum up interest. "Large Staffing Service" would be factually accurate.
This 100%. This needs to be a top level comment.
Ah, doing more than skimming the article
>I also saw what appeared to be medical documents uploaded to the app. These files were potentially uploaded as proof for why individual nurses missed shifts or took sick leave. These medical documents included medical reports containing information of diagnosis, prescriptions, or treatments that could potentially fall under the ambit of HIPAA regulations.
The title is exaggerating what the article says, and the article is making a big stretch about this possibly being HIPAA-covered. I stand corrected: this has nothing to do with HIPAA.
What was leaked was nurses' doctors notes submitted justifying calling out of work. Still a serious leak but nowhere near what is being suggested.
I'm confused because the article lays it out by the 4th paragraph, and you have the right understanding, up until "we're a startup"
Maybe you think the startup maintains patient records?
The article lays out that the nurses, the providers, uploaded them. This is a temp booking system. The health records were uploaded by the nurses to communicate reasons for absences to their employer, and weren't required or requested.
They have as much responsibility as Dropbox does. Nurses shouldn't have uploaded them.
Worth mentioning, because the authority level of medical practitioners throws people off: don't ever give a doctor or practice your Social Security Number. They don't need it. Similarly, if they want to check an ID, that doesn't mean they get to scan or photograph it. Doctors, practices, etc. are the worst at infosec. They have no training, basically no penalties if they do something wrong, and all of that info exists only so they can follow up in case you don't pay your bill.
In the US, HIPAA is pretty much the strongest privacy legislation there is. There's probably no group that would have a more severe penalty for leaking your info than your healthcare provider.
HIPAA has strict rules with severe penalties, but enforcement is at best spotty. So honest hospitals and doctors offices bend over backwards to comply with the rules at great expense, but bad actors are rarely punished. It's the worst of both worlds. I'm pretty sure that is why the punishments are so harsh, because they need to put the fear of god into practitioners to make them take it seriously since there are so few inspectors.
It's the gap in medical-establishment skill level between your doctor and you. You are always at a disadvantage. I've long thought that a disinterested third party needs to be involved: someone with real oversight, taking a position adversarial to the hospital, and working strictly to create the best possible outcome for the patient.
The Hippocratic model isn't awesome.
This is true, however getting it funded is the difficult task.
For it to be effective, the money can't come from the provider, meaning it's either from the payer or the patient. The payer doesn't really care, costs are contained as far as they are concerned, with the various Quality Initiatives. That leaves the patient to sign up for a subscription model.
I explored that as a business 12 years ago, and sadly there is still a need. The worst part is that most clinicians actually want to do the right thing but it's the admins in their organization who set up processes that result in terrible outcomes.
In 2025 an oath don't mean shit.
Perhaps true, but the strongest privacy protections in the US are still pretty weak. The biggest penalty I know of is Anthem 2018, where they leaked HIPAA-qualifying records on 80 million customers. Their financial penalty was a whopping... $16 million. Two dimes per affected customer!
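The "two dimes" figure checks out; a quick sanity check using the numbers cited above:

```python
# Anthem 2018: settlement amount divided across affected customers
fine = 16_000_000        # reported financial penalty, USD
affected = 80_000_000    # customers whose records were exposed
per_customer = fine / affected
print(f"${per_customer:.2f} per affected customer")  # $0.20
```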
It's true that the US rarely penalizes corporations enough to really disincentivize things, but healthcare providers probably take client data security more seriously than just about any other group besides maybe law firms. It's weird to single them out as being particularly unconcerned with and unpenalized for leaks.
We saw ours input PII into a Windows box. The idea that their ActiveX monstrosity has any security is not very persuasive.
ActiveX... haven't read that in a long time...
HIPAA was designed for portability -- the 'P' stands for portability, not privacy -- of health info, so there are immense carve-outs in service of that objective. Fines for violating HIPAA are almost non-existent.
HIPAA is wildly misunderstood by the public as a strong safeguard, meanwhile medical offices just get any patient (a captive audience) to sign a release waiver as part of patient intake ...
They get patients to sign something permitting them to share PHI with other entities like e.g. the lab that runs blood work, not to disclaim liability for leaking it unintentionally.
PCI-DSS is the strongest, HIPAA is just a rubber stamp
That's not actually law at all. It's part of the contract with payment processors.
How many healthcare providers do you know personally who have faced severe penalties for leaking information?
The reality is that for a small doctor/dental/whatever office, there is essentially 0 risk. HIPAA violations that carry significant penalties go to huge hospitals and healthcare companies.
Your neighborhood doctor has to screw up in a major way for an extended period of time to have a minute risk of any consequence.
How much information do you think your neighborhood PCP is “leaking” compared to, say, Elevance? This is such a goofy take. Are you expecting that every small provider group is just firing your data off on Facebook every Tuesday, and somehow, no one cares? They’re all using certified EMRs. They all take security seriously because their licenses are literally on the line. Do you work in healthcare?
If they provably expose your data, and you report them, they will get fined. Or they would have last year, who knows if those people still have jobs.
Only the young and inexperienced believe the law is enforced when it matters.
And yet the data still seems to leak pretty frequently...
Eh.
Last year, total HIPAA violation fines came to less than $9.2 million.
A figure I could find for hospital revenue in the same year, which is a good enough proxy for comparing fines vs. revenue, is about $1.2 trillion.
Which, rounding because who cares, means about 0.001% of medical revenue ends up being paid as HIPAA violation fines.
Proportionally, that's about the price of a cup of coffee per year for a typical person.
HIPAA needs teeth; what it says you're supposed to do is quite strong, but the enforcement of it is pathetic.
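For anyone who wants the ratio spelled out (same rough figures as above):

```python
# Total HIPAA fines vs. US hospital revenue in the same year (approximate figures)
fines = 9_200_000              # total HIPAA violation fines, USD
revenue = 1_200_000_000_000    # hospital revenue, USD, used as a proxy
ratio = fines / revenue
print(f"{ratio:.6%} of revenue")  # ~0.000767%, which rounds to about 0.001%
```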
What do you do if they refuse to book an appointment without it?
I've never had that happen (sample size ~5). They accept non-citizen patients, so they probably don't make SSN a required field.
(for SSN, never tried to prevent scanning of my ID)
In my experience, no one has ever asked for it when booking, just when you fill out forms on your first visit. I always leave it (and most other fields that don't pertain to my healthcare issue) blank and have never been hassled.
I also always ask for a paper copy of the disclosures to sign, saying "I don't sign blank checks" when asked to sign the electronic pad. I've never had an issue with them printing it out, letting me sign, and scanning it in.
Healthcare "security"/"authentication" is just "protected" by your name and date of birth, which are easily discovered online for just about anyone.
Find a new provider. I have gone 2 decades without providing my SSN to doctors.
Finding a new provider is unrealistic for many in the USA. In NYC, maybe easy; in rural WI or KS, much less so.
Not in my case, I do not provide my Social Security Number to (new to me) healthcare providers from small practices to major hospitals with different branches, either.
You can just use my SSN: 123-45-6789.
[dead]