KaiserEFC
Player Valuation: £70m
I’m starting to sound like my old man (something I never thought would happen) but a programmable digital currency is next. Then you’re stuffed.

They definitely are. I don't think anyone can really dispute that.
They can try, but I guarantee it won't happen. Encryption can't be "ended": it's an algorithm, and it's the foundation of security for most online transactions. Without it, the internet and cyber security end. They will lean on the big companies, but they've already told the government to do one.
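The point that encryption is mathematics rather than a switch anyone can flip off can be shown with a toy sketch. This is a one-time-pad XOR for illustration only; nothing here is production cryptography:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"transfer GBP 100"
key = secrets.token_bytes(len(message))  # random key, same length as the message

ciphertext = xor_bytes(message, key)     # encrypt
recovered = xor_bytes(ciphertext, key)   # decrypt: XOR is its own inverse
```

A few lines of arithmetic anyone can type in, which is why "banning" the underlying algorithms is not a realistic option; at most, governments can pressure the companies that deploy them.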
Laws are virtually never based in absolutes. That's the whole point of lawyers really, most of the time they aren't there to argue in pure black and white terms that their client did or didn't do something, they're arguing that the law can be applied in a way that means they should or shouldn't be found guilty. As somebody who's been involved in regulation I can tell you it's virtually impossible to write legislation or regulation in a purely black and white way, there will always be shades of grey.

Just based on a few points there, it's quite frightening that UK law is based on things that are not absolutes. Hate speech and misinformation should be black and white, for example, but the last few years have shown they are simply whatever is defined as such at the time. Plenty of "misinformation" has turned out to be true at a later date, and plenty of "hate speech" was not actual hate speech but simply "I don't like what you're saying". This just screams of censoring the internet, by forcing a platform either to go beyond its means in cases like this to comply, or simply to close down. It would make a lot more sense to age-restrict content by proving age than to go this route.
Laws are virtually never based in absolutes. That's the whole point in lawyers really, most of the time they aren't there to argue in pure black and white terms that their client did or didn't do something, they're arguing that the law can be applied in a way that means they should or shouldn't be found guilty. As somebody who's been involved in regulation I can tell you it's virtually impossible to write legislation or regulation in a purely black and white way, there will always be shades of grey.
“Given the range and diversity of services in scope of the new laws, we are not taking a ‘one size fits all’ approach. There are some things that all services will need to do, and other things that will depend on the risks associated with a particular service, and its size,” says a spokesperson.
Oh definitely. Don't get me wrong, you're the one who would have to deal with the consequences and if you're in any doubt at all as to whether you could be done for it then you're 100% doing the right thing. That's kind of what I mean really, no law is ever going to say 'you can do this, you can't do this', it will always be open to subjective interpretation to some extent and so if you think you could be found the wrong side of that it's just not worth the hassle.

Indeed.
Doesn't prevent it being a time bandit, ticking boxes and keeping documentation to show those boxes were ticked.
In response to a bit of criticism, they're softening their stance a little and there's some reference to more help and support to come.
Ascertaining our size and risk (current affairs really inflates that) isn't enthralling.
I thought that would be a bit of a workaround too. I figured early days that all we'd need to do is document that.
But it's not just illegal content as we've long defined it. There's misinformation, hateful content, etc. One member reported another last month as making hateful posts because they were autistic. I disagreed. They've said they'll report that. When that happens in future, I need to ensure that I, and the platform, are protected and fully compliant. If I can't, what happens then?
I appreciate it’s you who is the one who’s going to have to deal with it (so this is a very easy thing for me to say) but I think you’d be safe on CA provided there was an effective intervention mechanism and you recorded with sensible rationale why a post was or wasn’t removed.
A malicious poster or a malicious campaign (like something around one side or another of the Israel-Palestine thing) could cause grief by reporting to OFCOM if they were unhappy with the decision - but I’d be amazed if they entertained it; they’ll have their hands full with genuinely criminal matters.
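The "record a sensible rationale" advice above can be sketched as a minimal moderation log. The field names here are hypothetical, chosen just to show the shape of the record, not taken from the Act or any Ofcom guidance:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ModerationDecision:
    """One record per reported post: what was decided and why."""
    post_id: str
    reported_for: str   # e.g. "hateful content"
    action: str         # "removed" or "kept"
    rationale: str      # the sensible, recorded reasoning
    decided_at: str

def log_decision(decision: ModerationDecision, logfile: str) -> None:
    # Append-only JSON lines: cheap to write, timestamped,
    # and easy to produce if a regulator ever asks.
    with open(logfile, "a") as f:
        f.write(json.dumps(asdict(decision)) + "\n")

decision = ModerationDecision(
    post_id="12345",
    reported_for="hateful content",
    action="kept",
    rationale="Robust political disagreement; no protected characteristic targeted.",
    decided_at=datetime.now(timezone.utc).isoformat(),
)
log_decision(decision, "moderation_log.jsonl")
```

Even something this simple demonstrates an effective intervention mechanism with a recorded rationale for every keep-or-remove call.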
1. Small Platforms
- Definition: Platforms with fewer than 1 million UK users.
- Regulatory Impact: These platforms still have obligations, but they are more minimal than for medium and large platforms. You'll need to:
  - Have a clear user safety policy.
  - Provide easy reporting mechanisms for harmful content.
  - Ensure moderation of illegal content (e.g., hate speech, threats).
It's good for you and the other human rights deniers who support Starmer as you won't have to face the music when it all turns to…

Good, it's a depressing place, full of death, Tories and @davek
Be good to chat some Everton with some good Everton fans like @dylsexicbleu and @emir
It's good for you and the other human rights deniers who support Starmer as you won't have to face the music when it all turns to … for them.

Bruh, you simped for the Orange clown. Stand down, marine.