Interesting, since the government is in the pocket of the Murdoch empire. Since the bloodletting of news media began well over a decade ago, the industry has never found a way to catch up to a new world in which tech giants control what people see.
And to control what people see is to control how they think and who they vote for, and before you know it you're running nations that have convinced themselves they are democratic and free-thinking.
It's a mistake to think the Australian government is doing this just for the good of its people. This is the same government that wants a sovereign backdoor into every system under the guise of anti-terrorism. The same government that raided journalists at the national broadcaster to stop whistles being blown at the government, and to put the broadcaster back in its place.
So good luck legislating it. Perhaps try untangling some of your own hypocrisy first.
If you want support from Murdoch papers during election cycles, you need to give them what they want: for their businesses to be free from competition.
The National Broadband Network was crippled because it was a threat to Foxtel. And now Facebook and Google will need to be crippled because they are a threat to the Murdoch mastheads.
This has been going on since the beginning of time. If it's not tech, it's a newspaper or magazine, spinning words around in print to elicit a response, a very specific one.
Matt Drudge's Drudge Report did this just today, and it's exactly the problem. The headline read that CBP agents helped five illegal immigrants across the border, hands out.
That's not how it went down, and he left out a lot of important details. It didn't stop the barrage of right-wing commenters on the internet news articles, though.
If you were from Australia I'd let your whole post go.
Murdoch isn't that powerful. When the media tells people what they want to hear, it's very difficult from the outside to figure out who's calling the shots.
The AFP raided the ABC over the accuracy of a report about military action that was released under the reign of a far-left management. Now that Ita Buttrose is in charge, you won't see that again.
The Australian people trust their own government several orders of magnitude more than the foreign SV tech companies that put people out of jobs and push around politics that America can barely rein in.
Google, Facebook, Twitter, WhatsApp, Atlassian, and other significant concerned companies, please band together and stage a digital protest.
503 all your services, just for the city of Canberra (where our politicians congregate to shout insults at each other and legislate favours for their corporate donors).
A few days' cutoff for Canberra would give enough momentum to bring us back from the current precipice, where many of Australia's software companies are realistically weighing options for relocating out of Australia and shifting all intellectual property to a newly created company headquartered somewhere like Singapore, Estonia, or the USA.
And then reopening in Australia as a satellite subsidiary owned or licensed by the newly created, foreign-domiciled (and now non-taxpaying) parent.
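For what it's worth, the "503 just for Canberra" idea is technically simple. A minimal sketch, assuming some geo-IP lookup mapping client addresses to regions is available; the prefix list here is a made-up placeholder (a documentation range), not real Canberra address space:

```python
# Hypothetical sketch: return HTTP 503 only for requests geolocated to one city.
# CANBERRA_PREFIXES stands in for a real geo-IP database, which this sketch assumes.
from ipaddress import ip_address, ip_network

CANBERRA_PREFIXES = [ip_network("203.0.113.0/24")]  # placeholder (TEST-NET-3), not real

def is_canberra(client_ip: str) -> bool:
    """True if the client address falls in one of the (assumed) Canberra prefixes."""
    addr = ip_address(client_ip)
    return any(addr in net for net in CANBERRA_PREFIXES)

def handle_request(client_ip: str) -> tuple[int, str]:
    """Return an (HTTP status, body) pair for an incoming request."""
    if is_canberra(client_ip):
        return 503, "Service unavailable in your region as a protest."
    return 200, "Hello!"
```

In practice this would sit in a load balancer or edge proxy rather than application code, and real geo-IP data is only approximate, which is part of why such stunts are messier than they sound.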
Let's hope it passes. As much as companies would like to keep this a trade secret, I as an individual have a right to know what a company is doing with my data, how exactly it processes it, and based on what logic it interacts with me. Especially since these companies tend to dictate a significant part of the life experience of anyone living on this planet.
If that means those companies decide to stop operating in my country, I would still consider it a net win.
With great power comes great responsibility, and those companies have shown that we can't count on them being honest and open with us unless we force them to be.
I don’t think you can outrun this forever. Your Facebook feed, your LinkedIn feed and so on are all built for you specifically, and once you reload there is no public record of what content was pushed to you. This can be annoying, like when you spot an interesting LinkedIn post right as you’ve pulled to refresh and are never able to find it again. It can also be illegal, like when you’re exposed to political campaigning that isn’t on public record.
Tech may have been able to get away with this for a really long time, but almost every political leadership around the world is gearing up to fight it. The internet and the big tech companies have simply gotten too big to ignore, and maybe it’ll be tough for the first movers, but this isn’t only happening in Australia. The GDPR was Europe’s first serious response, but it certainly won’t be the last. I think governments will respond by doing what they always do, with heavy regulation, and that’s what I think we’re witnessing the start of.
I’m not so sure that’s a bad thing. I prefer openness, and Facebook and Google aren’t transparent when they won’t tell you how their services build your results or who pays for it to look the way it does. I don’t think it’ll really break the internet or tech industry either, people are adaptive and someone will find a way to make a lot of money without selling everyone’s private data.
Maybe that’s not going to be Google or Facebook, but do we really care about that?
Sounds good - as a Canberran, I think I'd really benefit from an involuntary social media detox for a few weeks!
This is a policy area in which it's completely legitimate for government to exercise its powers. A digital economy dominated by a handful of American companies is not healthy for our own software industry or our society.
I'll be looking out for the government's (and Labor's) response to this report - it will be interesting to see which recommendations they do and do not accept.
I see 2 problems with this. The first is that most of the algorithms they’re using are probably based heavily on machine learning, making them inscrutable not just to the general public but also to any experts they have auditing them. Fact is, Google and Facebook probably don’t know how their tech works half the time.
Second, this sets a pretty bad precedent. Asking one of these companies to reveal the algorithms they’re using is tantamount to theft. Why should they hand over their golden goose just because a lot of people want it? What’s to stop the same government from taking the technology of a smaller company by force later?
If people are interested in protecting the data Google displays, I think a better solution would be to go after whoever is leaking the information in the first place (e.g. if Google can crawl HIPAA info from another site, that’s the other site’s fault). If consumers willingly hand over their data to Google, on the other hand, that’s on them.
> Asking one of these companies to reveal the algorithms they’re using is tantamount to theft.
Intellectual property is a legislative creature. There was never a "natural law" conception of it and it largely didn't evolve in caselaw either.
That said, Australia has a constitutional guarantee against compulsory acquisition without payment on "just terms"[0] and which is not for a purpose supported by another Parliamentary power.
Such a case might not succeed (a case by tobacco companies against plain packaging was not successful in arguing that it represented an acquisition of their brands and trademarks[1]), but it would almost certainly reach the High Court.
If their Honours ruled that an algorithm or trained model was property within the meaning of the Constitution, it's likely that compulsory acquisition would require payment of billions, perhaps tens of billions, of dollars. At that point it would be easier to regulate it, which would be unlikely to trigger the guarantee.
I am, of course, not a lawyer. But Constitutional law was one of my favourite subjects before dropping out of law school.
[0] https://en.wikipedia.org/wiki/Section_51(xxxi)_of_the_Consti...
[1] http://eresources.hcourt.gov.au/showCase/2012/HCA/43
There are other motivations for accountability besides consumer protection. The potential for disinformation is unparalleled. And both have proven in recent years that their judgment is not beyond reproach. Trust is no longer an option.
> The first is that most of the algorithms they’re using are probably based heavily on machine learning, making them inscrutable not just to the general public but also to any experts they have auditing them.
I still think this is going to be a big problem with regard to legislation and safety boards. I'm pretty sure they are not going to accept the black-box nature of the decision making. They see the results of prejudiced or unsafe decisions, real or imagined[1], and will probably see the inability to explain these actions clearly as hubris on the part of companies. "The algorithm did it" will not hold as an excuse.
1) iPod shuffle play randomness
They should hand over whatever they are required to by law if they want to be available in a country that demands they hand things over. This is no different to cooperating with legal warrants and other law enforcement requests.
As to how appropriate the request is, that’s up for debate, but it’s not theft.
I demand public minutes of all meetings at all news outlets, because their decision process is an algorithm that determines the content going out to the public
Exactly, if people can demand this of Google and Facebook it means they can demand this of every single company! People act like FAANG are the only companies with any influence and any user data.
The analogy doesn't quite hold. We can see the outputs of the editorial meetings because they're released to the entire public, whereas the outputs of Google and Facebook models are tuned to each individual; it would be an incredible privacy violation to share those with the government.
But this creates a problem of incentives. In mass media, bad behaviour is highly observable and more likely to be punished or regulated. For example, Australia has defamation laws that are notoriously slanted against the media. Political communication is illegal without the name and address of the individual who authorised it. And so on.
But when each of us sees a different feed, it is impossible to both easily detect violations of law and to preserve individual privacy. Asking to see the algorithm at least allows for correction of potential biases. I expect however that increasingly, these companies will be required to internalise that regulatory function on pain of heavy fines.
Knowing the algorithms without having the data they operate on is useless.
And even if you had the two, at this point you still wouldn't be able to debug how a result is returned. The code is fairly complicated, there are many signals and quite a bit of ML.
source: worked as search ranking engineer and had quite a bit of difficulty debugging certain controversial results despite having access to everything.
There seems to be a significant incentive to get data that fits the purpose and run it through these algorithms. It's probably not trivial, but paid experts should be able to deal with it pretty quickly.
It's probably enough to run confined experiments. I think you don't need all the data to derive the most important conclusions.
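A toy illustration of what such a confined experiment could look like: treat the ranker as a black box and vary one signal at a time while holding the others fixed. `rank_score` here is an invented stand-in for the real system (which auditors would only be able to query, not read):

```python
# Toy black-box audit: estimate how sensitive an opaque ranking function is
# to each input signal, without ever looking at its internals.
def rank_score(relevance: float, recency: float, engagement: float) -> float:
    # Stand-in for the system under test; auditors would call the real service here.
    return 0.6 * relevance + 0.1 * recency + 0.3 * engagement

def sensitivity(signal: str, lo: float, hi: float) -> float:
    """How much the score moves when one signal sweeps lo -> hi, others held fixed."""
    base = dict(relevance=0.5, recency=0.5, engagement=0.5)
    low = rank_score(**{**base, signal: lo})
    high = rank_score(**{**base, signal: hi})
    return high - low

for sig in ("relevance", "recency", "engagement"):
    print(sig, round(sensitivity(sig, 0.0, 1.0), 2))
```

Real rankers are nonlinear and personalised, so probing is far harder than this sketch suggests, but it shows why auditors may not need the full dataset to draw the headline conclusions.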
I know I probably don't speak for the people who are drafting these rules, but personally that doesn't bother me so much. I don't have a specific interest in knowing the computations that produced a particular result, I just want to force some kind of transparency on how this data is being used and stored.
Australia can try, but I don't see this happening. It's not only a trade secret for FB and Google but it would open the ecosystem to a crazy amount of gaming.
I watched the announcement and it seemed like more talk than action. A few new inquiries to report in a year or so and a new unit in the ACCC for big tech companies to capture. Not much new legislation, though.
“This particular branch of the [commission] will be able to be approached by various companies who believe that the algorithms have been misused,”
The lobby of accountants seems to desire approving every algorithm, including AI. I've seen this come up in government circles in Europe. Yearly approval of financial numbers, algorithms, and tech in general. All this is going to do is create more red tape and no real change.
Just make online profiling illegal.
Honest question, why should I care what Google or Facebook does with my data? I am honestly curious, and would love if someone could explain the pros and cons to me.
The main "practical" reason is your personal security. Despite anonymization, the kind of data collected by these companies is generally enough to identify you easily. A bad actor at that company could do you quite a bit of harm. So could a politically hostile government, if it demands the company hand its data over.
And with the exception of HIPAA-controlled medical data in the USA, the company that collected the data is free to do anything it wants. It can resell the data, or store it in 10 places and never delete it. This means the window of risk for your data being compromised and used against you extends far into the future.
There is also the general belief that, if any of this data collection and tracking were done offline, it would be considered terrifyingly creepy.
So even if you have "nothing to hide", as many of us don't, it's not necessarily about protecting what you have right now. It's basically saying that people have a right to some amount of basic privacy, if only to protect their individual personal freedoms and security. Therefore we should take a principled stand against unmanaged black-box data collection at all times, in the same way that people in the USA try to defend their rights to free speech and press even if it does not necessarily affect them in daily life.
So it's about protecting your personal security now and into the future, as well as taking a principled stand on something that should be considered an individual right.
Edit: in this particular case they are interested in knowing how and why you saw some search results or some advertisement as a result of whatever algorithms are being run on your data. This is probably in response to increasing news coverage of people being unduly influenced by various search and advertising algorithms out there on the Internet. Personally I don't know if I care specifically about the algorithms, I just care what data has been collected, where it is being kept, and how I can have it deleted or not have it be kept in the first place.
In all likelihood, if I had access to Google or FB data I could get reams of personal information on exes and make their lives hell. I could find out secrets about local business owners. I could use the data to imply all kinds of untrue things about a person just for the sake of being a dick.
There's an interesting point Dan Geer made about identities for state-level spying activities. If he were to do something illegal (shall we say) in a place he shouldn't, with the level of information collected across the whole of the network, it would be far more feasible to borrow somebody else's identity for the work than it would be to create a plausible fake person from the ground up.
Our database identities are skins for spies now. Do you want to be caught up in that game?
People might elect a genocidal government that decides to send all people with Trait X to the gulag, and Google or Facebook have data identifying you as having Trait X. Clearly the solution is to give the government even more power so it can stop these companies collecting the data, because without them there's no way the government could conduct mass data collection on its citizens.
Given how people presently get their information about politics, if you don't know the algorithm, you don't have a democracy.
Algorithms can change, though. An extreme and made-up example: Facebook could publish its algorithms, then update them just prior to an election and not disclose the change until afterwards, citing the administrative time needed to publish that information.
Obviously I'm not suggesting this would happen in practice, but my point is that governments having a published document for tech algorithms is essentially meaningless in terms of protecting democracy (instead you'd need other controls in place). If anything, it just makes it even easier for governments to exploit technology for their own gain - which is more likely the reason they want insider knowledge.
> Make no mistake, these companies are among the most powerful and valuable in the world. They need to be held to account and their activities need to be more transparent.
Open source is as important today as ever. The only way of controlling our own economy, news, even who we date and marry, is to know how the algorithms and software that guide our lives work.
All these algorithms are being optimized for things that have nothing to do with our wellbeing. To know what they do is just a step in the right direction.
I would like to see all these tech giants forced to use open-source solutions that everybody can scrutinize. If we are so dependent on this digital infrastructure, making it transparent is a necessity. That is also true for software used by governments themselves.
Google is Google because it has gulped down enormous infrastructure and data, leaving everyone else so far behind it's impossible to catch up.
Facebook is FB because it went on an uber-aggressive spree of acquiring users circa 2007, which paid off massively by creating the biggest network lock-in ever.
My bet is their algorithms are what everyone expects them to be, and no actionable info will come from this legislation. The real, hard question is how to create a nonmonopolistic market in internet media.
This is an obvious attempt to control those algorithms, because (a) why else would you want to lift the veil, (b) it's not like there's some super-hidden secret sauce: they train models to increase feedback signals, and (c) those models are too complicated to figure out their effects unless Australian MPs are geniuses. It has the potential to set off a chain of events outside Australia, since both the left and the right everywhere are waving their fists at Google/FB. The result could be the first algorithm literally designed by committee, and could cripple Google's revenues. It should teach them a lesson, however, not to cozy up to any government or take political sides.
A follow up question - once the AU government has the ability to view/audit this information, what would it do with it?
Perhaps there are solutions that could let the government regulate search without violating the business's right to privacy. Regulators probably don't need to know how many layers a neural net has, they need to know what the outputs are from politically important queries.
> Digital-industry group DIGI—which represents Google, Facebook, Twitter and Verizon Media—said lawmakers need to think carefully about unintended consequences that could affect competition and the range of products available to Australian consumers.
Seems a little disingenuous; pretty sure Facebook and Google aren't really concerned about competition or the range of products available.
The group is suggesting that its members will make some of their products unavailable to Australian users if regulations they deem unacceptable are imposed.
It's plausible. Australia's population is about 25 million. Assuming 90% of the population uses Facebook, they're under 1% of Facebook's userbase. That's enough of a loss to care about, but not so much that the company couldn't consider it as an option.
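The arithmetic behind that estimate, with the figures above treated as rough assumptions:

```python
# Back-of-the-envelope check of the "under 1% of Facebook's userbase" claim.
# All figures are rough assumptions taken from the comment above.
au_population = 25_000_000
au_fb_users = 0.9 * au_population       # assume 90% of Australians use Facebook
global_fb_users = 2_400_000_000         # approximate global userbase at the time
share = au_fb_users / global_fb_users
print(f"{share:.2%}")                   # ~0.94%, i.e. just under 1%
```

Even if the penetration assumption is generous, the conclusion is robust: Australia is a low-single-digit-million-dollar rounding error in user terms, though not in revenue-per-user terms.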
From my read, most of it is basically a copy of the GDPR. The parts related to algorithm and data transparency were basically:
* Content sites suddenly drop in search rankings, creating huge revenue drops. They want to force content aggregators to give ample warning before pushing changes that would influence content creators' cash flow.
* SEO is expensive. They want Google and Facebook to make it easier for them.
* Content sites don't like AMP, since its requirements limit the number of ads they can put on a site and make it hard for them to track their users. They want Google to share more user data to make up for it.
In my view these concerns are mostly fine and are unlikely to make a dent in Google or Facebook. I think the most interesting point is this:
* Amend the Competition and Consumer Act 2010 so that unfair contract terms are prohibited (not just voidable). This would mean that civil pecuniary penalties apply to the use of unfair contract terms in any standard form consumer or small business contract.
That might actually help news sites in a fair way, contracts like "Google might display snippets from my site for free because otherwise it hurts my bottom line" will probably result in fines for Google.
I’m not sure this helps the news sites that much. The ACCC’s site on unfair contracts gives examples such as:
* terms that enable one party (but not another) to avoid or limit their obligations under the contract
* terms that enable one party (but not another) to terminate the contract
* terms that penalise one party (but not another) for breaching or terminating the contract
* terms that enable one party (but not another) to vary the terms of the contract.
“Google may display short snippets in exchange for a better position on the index” doesn’t reach the type of unfairness that the law covers, I don’t think.
Making the algorithm public won’t make them lose their competitive edge. It’s the data and their experts that allow them to use their algorithms in any meaningful way.
[+] [-] redact207|6 years ago|reply
And to control what people see is to control how they think and who to vote for, and before you know it you run nations who have convinced themselves they are democratic and free thinking.
It's a mistake to think the Australian government is doing this just for the good of its people. This is the same government that wants a sovereign backdoor into every system under the guise of anti-terrorism. The same govenrment who raided journalists from the national broadcaster to stop whistles being blown at the government and also to put the broadcaster back in its place.
So good luck legislating it. Perhaps try untangling some of your own hypocrisy first.
[+] [-] threeseed|6 years ago|reply
The National Broadband Network was crippled because it was a threat to Foxtel. And now Facebook and Google will need to be crippled because they are a threat to the Murdoch mastheads.
[+] [-] qrbLPHiKpiux|6 years ago|reply
This has been going on since the beginning of time. If it's not tech, it's a newspaper or magazine, spinning words around in print to elicit a response, a very specific one.
Matt Druge, drudgereport, did this today and it's exactly the problem: Headline read BCP agents helped across the border, with hands out, 5 illegals.
That's not how it went down and he left out a lot of important details. It did not stop the barrage of the right-wing internet news article commenters.
[+] [-] friendlybus|6 years ago|reply
Murdoch isn't that powerful. When the media tells people what they want to hear, it's very difficult from the outside to figure out who's the one pulling the shots.
The AFP raided the ABC to ensure accuracy of a report about military action that was released under the reign of a far left management. Now that Ita Buttrose is in charge, you won't see that again.
The Australian people trust their own government several orders of magnitude more than the foreign SV tech companies that put people out of jobs and push around politics which America can barely reign in.
[+] [-] mmerlin|6 years ago|reply
503 all your services, just for the city of Canberra (where our politicians congregate to shout insults at each other and legislate favours for their corporate donors).
A few days cutoff for Canberra will give enough momentum to bring us back from the current precipice, where many of Australia's software companies are realistically considering options for relocating out of Australia, and shifting all intellectual property to their newly created company headquartered in places like Singapore or Estonia or USA.
And then reopen again in Australia as a satellite subsidiary branch owned or licensed by the newly created foreign-domiciled (and now non-taxpaying) parent.
[+] [-] levosmetalo|6 years ago|reply
If that would mean those company would just decide not to operate anymore on my country, I would still consider that as a net win.
With great power comes the great responsibility, and those companies showed that we can't count on then bring honest and open to us unless we force them to.
[+] [-] moksly|6 years ago|reply
Tech may have been able to get away with this for a really long time, but almost every political leadership around the world is gearing up to fight it. The internet and the big tech companies have simply gotten too big to ignore, and maybe it’ll be tough for the first movers, but this isn’t only happening in Australia. The GDPR was Europes first serious response, but it certainly won’t be the last. I think they’ll respond by doing what governments always do, with heavy regulation, and that’s what I think we’re witnessing the start of.
I’m not so sure that’s a bad thing. I prefer openness, and Facebook and Google aren’t transparent when they won’t tell you how their services build your results or who pays for it to look the way it does. I don’t think it’ll really break the internet or tech industry either, people are adaptive and someone will find a way to make a lot of money without selling everyone’s private data.
Maybe that’s not going to be Google or Facebook, but do we really care about that?
[+] [-] ajdlinux|6 years ago|reply
This is a policy area in which it's completely legitimate for government to exercise its powers. A digital economy dominated by a handful of American companies is not healthy for our own software industry or our society.
I'll be looking out for the government's (and Labor's) response to this report - it will be interesting to see which recommendations they do and do not accept.
[+] [-] vgetr|6 years ago|reply
Second, this sets pretty bad precedent. Asking one of these companies to reveal the algorithms they’re using is tantamount to theft. Why should they hand over their golden goose just because a lot of people want it? What’s to stop the same government to take the technology of a smaller company by force later?
If people are interested in protecting the data Google displays, I think a better solution would be to go after who is leaking the information in the first place (e.g. if Google can crawl HIPAA info from another site, that’s the other site’s fault). If consumers willingly hand over their data to Google, on the other hand, that’s on them.
[+] [-] jacques_chester|6 years ago|reply
Intellectual property is a legislative creature. There was never a "natural law" conception of it and it largely didn't evolve in caselaw either.
That said, Australia has a constitutional guarantee against compulsory acquisition without payment on "just terms"[0] and which is not for a purpose supported by another Parliamentary power.
Such a case might not succeed (a case by tobacco companies against plain packaging was not successful in arguing that it represented an acquisition of their brands and trademarks[1]), but it would almost certainly reach the High Court.
If their Honours ruled that an algorithm or trained model was property within the meaning of the Constitution, it's likely that compulsory acquisition would require payment of billions, perhaps tens of billions, of dollars. At that point it would be easier to regulate it, which would be unlikely to trigger the guarantee.
I am, of course, not a lawyer. But Constitutional law was one of my favourite subjects before dropping out of law school.
[0] https://en.wikipedia.org/wiki/Section_51(xxxi)_of_the_Consti...
[1] http://eresources.hcourt.gov.au/showCase/2012/HCA/43
[+] [-] vnchr|6 years ago|reply
[+] [-] protomyth|6 years ago|reply
I still think this is going to be a big problem with regards to legislation and safety boards. I'm pretty sure they are not going to accept the black box decision making nature. They see the results of prejudice or unsafe decisions, real or imagined[1], and will probably see lack of ability to explain these actions clearly as hubris on the part of companies. The algorithm did it will not hold as an excuse.
1) iPod shuffle play randomness
[+] [-] mc32|6 years ago|reply
[+] [-] techdragon|6 years ago|reply
As to how appropriate the request is, that’s up for debate, but it’s not theft.
[+] [-] np_tedious|6 years ago|reply
[+] [-] spunker540|6 years ago|reply
[+] [-] jacques_chester|6 years ago|reply
But this creates a problem of incentives. In mass media, bad behaviour is highly observable and more likely to be punished or regulated. For example, Australia has defamation laws that are notoriously slanted against the media. Political communication is illegal without the name and address of the individual who authorised it. And so on.
But when each of us sees a different feed, it is impossible to both easily detect violations of law and to preserve individual privacy. Asking to see the algorithm at least allows for correction of potential biases. I expect however that increasingly, these companies will be required to internalise that regulatory function on pain of heavy fines.
[+] [-] throwwysranker|6 years ago|reply
And even if you had the two, at this point you still wouldn't be able to debug how a result is returned. The code is fairly complicated, there are many signals and quite a bit of ML.
source: worked as search ranking engineer and had quite a bit of difficulty debugging certain controversial results despite having access to everything.
[+] [-] starbugs|6 years ago|reply
It's probably enough to run confined experiments. I think you don't need all the data to derive the most important conclusions.
[+] [-] unknown|6 years ago|reply
[deleted]
[+] [-] nerdponx|6 years ago|reply
[+] [-] echan00|6 years ago|reply
[+] [-] s_Hogg|6 years ago|reply
[+] [-] synctext|6 years ago|reply
The lobby of accountants seems to desire approving every algorithm, including AI. I've seen this come up in government circles in Europe. Yearly approval of financial numbers, algorithms, and tech in general. All this is going to do is create more red tape and no real change.
Just make online profiling illegal.
[+] [-] newbie578|6 years ago|reply
[+] [-] nerdponx|6 years ago|reply
And with the exception of HIPAA-controlled medical data in the USA, the company that collected the data is free to do anything it wants. It can resell the data, or store it in 10 places and never delete it. This means the window of risk for your data being compromised and used against you extends far into the future.
There is also the general belief that, if any of this data collection and tracking we're done off-line it would be considered terrifyingly creepy.
So even if you have "nothing to hide", as many of us don't, it's not necessarily about protecting what you have right now. It's basically saying that people have a right to some amount of basic privacy, if only to protect their individual personal freedoms and security. Therefore we should take a principled stand against unmanaged black-box data collection at all times, in the same way that people in the USA try to defend their rights to free speech and press even if it does not necessarily affect them in daily life.
So it's about protecting your personal security now and into the future, as well as taking a principled stand on something that should be considered an individual right.
Edit: in this particular case they are interested in knowing how and why you saw some search results or some advertisement as a result of whatever algorithms are being run on your data. This is probably in response to increasing news coverage of people being unduly influenced by various search and advertising algorithms out there on the Internet. Personally I don't know if I care specifically about the algorithms, I just care what data has been collected, where it is being kept, and how I can have it deleted or not have it be kept in the first place.
[+] [-] friendlybus|6 years ago|reply
There's an interesting point Dan Geer made about identities for state-level spying activities. If he were to do something illegal (shall we say) in a place he shouldn't, with the level of information collected across the whole of the network, it would be far more feasible to borrow somebody else's identity for the work than it would be to create a plausible fake person from the ground up.
Our database identities are skins for spies now; do you want to be caught in that game?
[+] [-] laumars|6 years ago|reply
Obviously I'm not suggesting this would happen in practice, but my point is that governments having a published document for tech algorithms is essentially meaningless in terms of protecting democracy (you'd need other controls in place instead). If anything, it just makes it even easier for governments to exploit technology for their own gain, which is more likely the reason they want insider knowledge.
[+] [-] kartan|6 years ago|reply
Open source is as important today as ever. The only way to retain control over our own economy, our news, even who we date and marry, is to know how the algorithms and software that guide our lives work.
All these algorithms are being optimized for things that have nothing to do with our wellbeing. To know what they do is just a step in the right direction.
I would like to see all these tech giants forced to use open source solutions that everybody can scrutinize. If we are so dependent on this digital infrastructure, making it transparent is a necessity. The same is true for software used within governments themselves.
[+] [-] buboard|6 years ago|reply
Facebook is FB because it went on an uber-aggressive spree of acquiring users circa 2007, which paid off massively by creating the biggest network lock-in ever.
My bet is their algorithms are what everyone expects them to be, and no actionable info will come from this legislation. The real, hard question is how to create a nonmonopolistic market in internet media.
[+] [-] whalabi|6 years ago|reply
Seems a little disingenuous; pretty sure Facebook and Google aren't really concerned about competition or the range of products available.
[+] [-] Zak|6 years ago|reply
It's plausible. Australia's population is about 25 million. Assuming 90% of the population uses Facebook, they're under 1% of Facebook's userbase. That's enough of a loss to care about, but not so much that the company couldn't consider it as an option.
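The parent's back-of-envelope numbers check out. A minimal sketch (the 90% adoption rate and the ~2.3 billion global user figure are assumptions for illustration, not sourced):

```python
# Rough check of the parent comment's claim that Australian users
# make up under 1% of Facebook's userbase. All figures are
# assumptions: population ~25M, ~90% adoption, ~2.3B global users.
australia_population = 25_000_000
assumed_adoption = 0.90
facebook_global_users = 2_300_000_000  # rough 2019 ballpark

australian_users = australia_population * assumed_adoption
share = australian_users / facebook_global_users

print(f"{australian_users:,.0f} users is {share:.2%} of the userbase")
```

On these assumptions the share comes out just under 1%: painful to lose, but small enough that walking away is at least a credible bargaining position.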
[+] [-] username90|6 years ago|reply
https://www.accc.gov.au/system/files/Digital%20platforms%20i...
From my read, most of it is basically a copy of the GDPR. The parts related to algorithm and data transparency were basically:
* Content sites suddenly drop in search rankings, creating huge revenue drops. They want to force content aggregators to give ample warning before pushing changes that would influence content creators' cash flow.
* SEO is expensive. They want Google and Facebook to make it easier for them.
* Content sites don't like AMP, since its requirements limit the number of ads they can put on a site and make it hard for them to track their users. They want Google to share more user data to make up for it.
In my view these concerns are mostly fine and are unlikely to make a dent in Google or Facebook. I think the most interesting point is this:
* Amend the Competition and Consumer Act 2010 so that unfair contract terms are prohibited (not just voidable). This would mean that civil pecuniary penalties apply to the use of unfair contract terms in any standard form consumer or small business contract.
That might actually help news sites in a fair way: contract terms like "Google may display snippets from my site for free, because paying for them would hurt its bottom line" would probably result in fines for Google.
[+] [-] qtplatypus|6 years ago|reply
* terms that enable one party (but not another) to avoid or limit their obligations under the contract
* terms that enable one party (but not another) to terminate the contract
* terms that penalise one party (but not another) for breaching or terminating the contract
* terms that enable one party (but not another) to vary the terms of the contract
“Google may display short snippets in exchange for a better position in the index” doesn't, I think, reach the type of unfairness that the law covers.