
I will just say this trend of "land acknowledgements" feels like the worst flotsam of liberalism - it is performatively solemn, prescriptive "on behalf" of some marginalized people, but ultimately meaningless. It feels like the liberal version of a "thoughts & prayers" response.

It is especially misguided since the acknowledged "true owners" of the land usually didn't subscribe to the concept of land ownership. The whole concept is just weird and I don't understand the objective. To be appreciative of the land's history? Why would this be required in a college course syllabus?

I say this as a liberal in SF who is part Native American - who the hell wanted this?


It can be performative and helps show that one is aligned with prevailing political orthodoxy.

Did you see the other post about this where the guys showed a Flock camera pointed at a playground, so any pedo can see when kids are there and not attended?

Or how it has become increasingly trivial to identify by face or license plate such that combining tools reaches "movie Interpol" levels, without any warrant or security credentials?

If Big Brother surveillance is unavoidable I don't think "everyone has access" is the solution. The best defense is actually the glut of data and the fact nobody is actively watching you picking your nose in the elevator. If everyone can utilize any camera and its history for any reason then expect fractal chaos and internet shaming.


> Did you see the other post about this where the guys showed a Flock camera pointed at a playground, so any pedo can see when kids are there and not attended?

If it's inappropriate for any pedo to see when kids are in a park then certainly it should be inappropriate when those pedos just happen to be police officers or Flock employees. The nice thing about the "everyone has access" case is that it forces the public to decide what they think is acceptable instead of making it some abstract thing that their brains aren't able to process correctly.

People will happily stand under mounted surveillance cameras all day long, but the moment they actually see someone point a camera at them they consider that a hostile action. The surveillance camera is an abstract concept they don't understand. The stranger pointing a camera in their direction is something they do understand and it makes their true feelings on strangers recording them very clear.

We might need a little bit of "everyone has access" to convince people of the truth that "no one should have access" instead.


> so any pedo can see when kids are there and not attended?

Sure. It also lets parents watch. Or others see when parents are repeatedly leaving their kids unattended. Or lets you see some person that keeps showing up unattended and watching the kids.

> Or how it has become increasingly trivial to identify by face or license plate such that combining tools reaches "movie Interpol" levels, without any warrant or security credentials?

That already exists and it is run by private companies and sold to government agencies. That’s a huge power grab.

> The best defense is actually the glut of data and the fact nobody is actively watching you picking your nose in the elevator. If everyone can utilize any camera and its history for any reason then expect fractal chaos and internet shaming.

This argument holds whether it is public or not. It is worse if Flock or the government can do this asymmetrically than if anyone can do it IMO, they already have enough coercive tools.


"Or others see when parents are repeatedly leaving their kids unattended."

... which is the expected, default use-case for a playground ...


I didn't want to get into an argument over whether kids should be unattended at playgrounds or not - I don't know where the other poster is from, and it seems to depend on age, density, region, etc. Where I grew up it would be weird to stay; in the city I am in it would be weird to leave them.

If you leave your kids unattended at a playground I don't see how the camera changes the risk factor in any meaningful way. Either a pedophile can expect there to be unattended children or not.


It’s anonymity of the viewers combined with mass open-access surveillance that enables an unheard of level of stalking capacity.

Most people don’t like the idea that strangers could easily stalk their child remotely.

It’s the ease of access to surveillance technology that is different. It has nothing to do with the park being safe or not.

Try to think like an evil person with no life and very specific and demonic aims if you’re still having trouble seeing why this would be an issue.


> Try to think like an evil person with no life and very specific and demonic aims if you’re still having trouble seeing why this would be an issue.

That person already has incredible power to stalk and ruin someone's life. Making Flock cameras public would change almost nothing for that person. It fascinates me how fast people jump to "imagine the worst person" when we talk about making data public.

We have the worst people; they're the ones who profit off of it being private, with no public accountability, who don't build secure systems. The theater of privacy is, IMO, worse than not having privacy.


“almost nothing” is doing a lot of heavy lifting in that sentence.

Stalking someone from your desk vs. IRL is a whole different ball game. Not sure why this needs explanation… anyways, the main difference is how easy it is to do things from your desk. For example, no one sees you when you’re stalking someone from your desk. Think of the success of 4chan investigations vs. those with the authority to actually do so. It’s empowering.

We live in a world of strangers, and unfortunately a % of those are the type to kill/rape other strangers. Why enable them?

Not sure who else would be empowered by making all public cameras accessible at the click of a button, but I’m interested in who you think that population is.

Certainly we can agree most normal folks will not spend their time looking at camera feeds of strangers?

I’m fascinated by people who stick to their theoretical principles (‘all data should be public’, etc.) no matter the real world implications, but we all have our own interests :).


Once I learned to ride a bike I took myself to the playground.

There are sites that index thousands of public live streaming cameras, with search fields where you can just enter "park" and get live cams with kids playing, because people have specifically arranged for those cameras to exist.

If you're that worried about child molesters knowing where the kids are, I've got very bad news for you: https://www.statista.com/statistics/254893/child-abuse-in-th...

Turns out, 95% of the predators already know exactly where the victims are, usually because it's their kid. Probably we want to worry about that a lot more.

Doubly so since, y'know, this only works if the predator lives close enough to act on the information before it changes - so the tiny possibility of a predator, a tiny possibility that they didn't already know this, and a tiny possibility of being able to act on the information...


The "coinciding problems" should be an assumption, not a edge case we reason away. Because black swan events are always going to have cascading issues - a big earthquake means lights out AND cell towers overloaded or out, not to mention debris in streets, etc.

What they need is a "shit is fucked fallback" that cedes control. Maybe there is a special bluetooth command any police or ambulance can send if nearby, like clear the intersection/road.

Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance. To techies and lawyers it may sound impossible, but for normal humans, that certainly sounds better. Like that Mitch Hedberg joke, when an escalator is out of order it becomes stairs. When a Waymo breaks it should become a car.


> Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance.

Do they even have physical controls to do that at this point?

I’ve never been in one so I don’t know how different they are from normal cars today.


The Waymos still have all their normal driver controls. There is a process where law enforcement can enter the vehicle, call Waymo and verify their law enforcement status, and then switch the vehicle into manual mode and drive it as normal.

Here are their instructions for law enforcement in the Waymo Emergency Response Guide:

https://storage.googleapis.com/waymo-uploads/files/first%20r...


Ok. Thanks. I must have been thinking of something else.

Didn’t Google have little self-driving vehicles without controls that were limited to pre-planned routes on non-public roads on their campus?

Obviously a hugely different problem domain.


This is a stupid take if we are talking about an actual "bailout" and not an oversized "grease the palms of my golfing buddies" grift.

The only relation to a bailout is the amount of money. But OpenAI doesn't have any infrastructure risk or systemic risk. There isn't even an industry collapse risk because if OpenAI collapses Google and open source models will happily soak up their users.

Now I could totally see a "Big Beautiful Bailout" happening, but again, that would just be more grift. A bailout is meant to recoup from a mistake. Throw a trillion at OpenAI and you... increase their runway a couple of years?


I think it taxes your brain in two different ways - the mental model of the code is updated in the same way a PR from a co-worker updates it, but every minute instead of every now and then. So you need to recalibrate your understanding and think through edge cases to determine if the approach is what you want or if it will support future changes etc. And this happens after every prompt. The older/more experienced you are, the harder it is to NOT DO THIS thinking even if you are intending to "vibe" something, since it is baked into your programming flow.

The other tax is the intermittent downtime when you are waiting for the LLM to finish. In the olden days you might have productive downtime waiting for code to compile or a test suite to run. While this was happening you might review your assumptions or check your changes or realize you forgot an edge case and start working on a patch immediately.

When an LLM is running, you can't do this. Your changes are being done on your behalf. You don't know how long the LLM will take, or how you might rephrase your prompt if it does the wrong thing until you see and review the output. At best, you can context switch to some other problem but then 30 seconds later you come back into "review mode" and have to think architecturally about the changes made then "prompt mode" to determine how to proceed.

When you are doing basic stuff all of this is ok, but when you are trying to structure a large project or deal with multiple competing concerns you quickly overwhelm your ability to think clearly because you are thinking deeply about things while getting interrupted by completed LLM tasks or context switching.



oh no I wonder what will happen

You're being facetious, I see, but I assume litigation will be brought against it, it will be struck down as blatantly illegal in the next 6 months, and the people involved will not get paid and will be fired / dissolved.

> Tech Force will primarily recruit early-career technologists

Baking in the ageism right from the start. Few want to work for this government, but at least they can keep those unsavory 30+ year olds out!


> ageism

This is DEI talk to them.


So is hiring young people over a better qualified senior applicant. You wouldn't have to make it a point if recent grads would win in a meritocracy.

It's fine to hire a bunch of juniors but then you're kinda explicitly not looking for the best. But at $150k-ish they'll get mid-career and senior devs from low-CoL areas pouring in.


Last Day, Capricorn 25s, this is Carousel…

I have some context here, as my dad used to work at a state college running "the systems". There was an era of thin clients and a centralized VAX machine or similar that did all the work. I remember weekends where my dad had to work because they were "running the numbers", which involved calculating grades and producing end-of-semester reports and such. Somehow this took more than a day of processing for a few thousand students and ran on a big tape machine. Sometimes it would crash or something, so someone had to be there to keep things moving.

I don't remember all the details, but this is what they used up til the mid-90s. By then, I could probably run something on my 486 home computer that would complete in half an hour. But there were decades of process and customization embedded in these systems.

When modernization happened, it was swift. My dad was lucky with the timing as he was retiring during the transition so even made bonus money coming back as a consultant. But you can imagine that even if the new software was pricey and not as customizable, the speed improvements and reduction in staff made sense.

Once the old staff was cleared out, there was no department of staff being paid to build computer services, only the lesser staff needed to maintain and use it. The issue was that hardware/Internet usage expanded too fast, the importance of and reliance on tech grew, and it became a selling point for unis to have the newest systems in place.

It makes sense now for the pendulum to swing in the other direction, as customization and cost are wildly out of balance with AI and the latent tech workforce available at every college.

I would say the blocker now is the same as what allowed creaky old systems to persist into the 90s - administration doesn't give a shit about any of this and it is only viewed as a cost center. Until differentiating through customization provides an obvious and immediate fiscal benefit to the admins themselves, most unis won't look at changing off their shitty landlord systems until they are basically forced to by the market.


5th grade, my best friend at the time was on a basketball team, just a small town league for kids. I never really played basketball, so I was planning to watch the game and then we'd hang out. It was the first game of the season and my friend was getting his uniform from a table when a dad running things asked me what team I was playing on, and I said none, I'm just here to hang out with my friend.

He shook his head and said, "No, that won't do. You're on his team, too" and handed me a jersey. Then he went ahead and paid my registration fee.

More than the money, it was the proactive nature of it that struck me at the time. The thing is, if I had asked my parents they probably would have signed me up. But it was one of those things where it would have never crossed my mind to ask. I was one of those kids that needed a push every now and then and rarely got one.

I never got very good at basketball but I never missed a game and had a great time with my friend. So not a tragic or desperate story, but still meaningful to me all these years later.


There are a million ways where that interaction goes sideways and becomes a drama between the parents nowadays.

It's the kids that seem to appreciate it more than the parents, funnily enough.

I agree with the replies to this saying that the fact it could lead to drama should not prevent people doing things like this, but I can see this causing trouble/resentment too.

I think a lot of the other unasked for examples given could also cause resentment. Perhaps often the right thing to do is just taking the risk.


Things have always been able to go wrong. That's not a reason to stop doing things. Oh no, you might get an earful from an angry parent once in a while. Boo hoo.
