Pyrmont, Australia

The Australian Communications and Media Authority (ACMA) is an Australian Government statutory authority within the Communications portfolio. The ACMA is tasked with ensuring that media and communications work for all Australians, which it does through various legislation, regulations, standards and codes of practice. The ACMA is a 'converged' regulator, created to oversee the convergence of the four 'worlds' of telecommunications, broadcasting, radiocommunications and the internet. It was formed on 1 July 2005 by a merger of the Australian Broadcasting Authority and the Australian Communications Authority, and is one of only a handful of converged communications regulators in the world. (Source: Wikipedia)


Xiong L., Australian Communications and Media Authority | Xiong L., University of Sydney | Libman L., University of Sydney | Libman L., NICTA | Mao G., University of Sydney
IEEE Journal on Selected Areas in Communications | Year: 2012

Cooperative communication techniques offer significant performance benefits over traditional methods that do not exploit the broadcast nature of wireless transmissions. Such techniques generally require advance coordination among the participating nodes to discover available neighbors and negotiate the cooperation strategy. However, the associated discovery and negotiation overheads may negate much of the cooperation benefit in mobile networks with highly dynamic or unstable topologies (e.g. vehicular networks). This paper discusses uncoordinated cooperation strategies, where each node overhearing a packet decides independently whether to retransmit it, without any coordination with the transmitter, intended receiver, or other neighbors in the vicinity. We formulate and solve the problem of finding the optimal uncoordinated retransmission probability at every location as a function of only a priori statistical information about the local environment, namely the node density and radio propagation model. We show that the solution consists of an optimal cooperation region, and we provide a constructive method to compute this region explicitly. Our numerical evaluation demonstrates that uncoordinated cooperation offers a viable low-overhead alternative, especially in high-noise (or low-power) and high-node-density scenarios. © 2012 IEEE.
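As a rough illustration of the mechanism (not the paper's actual optimization), the sketch below simulates nodes that independently retransmit an overheard packet with a fixed probability whenever they lie in a heuristic cooperation region. Every parameter here (density, range, geometry) is invented for the example; the paper derives the optimal region analytically from the density and propagation statistics.

```python
import math
import random

# Toy Monte Carlo sketch of uncoordinated cooperative relaying.
# All numbers below are illustrative assumptions, not values from the paper.
DENSITY = 2e-4                           # nodes per square metre
RANGE = 150.0                            # disc propagation model, metres
AREA = 400.0                             # side of the square field, metres
SRC, DST = (0.0, 200.0), (220.0, 200.0)  # out of direct radio range

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def useful_relay(node):
    # Stand-in "cooperation region": a node helps only if it both
    # overhears the source and can reach the destination.
    return dist(node, SRC) <= RANGE and dist(node, DST) <= RANGE

def delivery_prob(p_retx, trials=10_000):
    n = round(DENSITY * AREA * AREA)     # expected node count, fixed here
    hits = 0
    for _ in range(trials):
        nodes = ((random.uniform(0, AREA), random.uniform(0, AREA))
                 for _ in range(n))
        # Delivery succeeds if at least one eligible node, acting alone
        # and without any coordination, decides to retransmit.
        if any(useful_relay(v) and random.random() < p_retx for v in nodes):
            hits += 1
    return hits / trials

# Higher p is not automatically better in practice: simultaneous
# retransmissions collide, which is why an *optimal* probability exists.
# This toy model ignores collisions and only shows the mechanism.
for p in (0.1, 0.5, 1.0):
    print(f"p = {p:.1f}: delivery probability ~ {delivery_prob(p):.3f}")
```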


Tanner G., Australian Communications and Media Authority
Telecommunications Journal of Australia | Year: 2013

The re-farming of spectrum (i.e. its reassignment to services with a higher value), triggered by free-to-air television digitisation and the switch-off of analogue TV, represents a significant microeconomic reform. Digitisation has already transformed the television industry, and analogue TV switch-off will yield a 'digital dividend' for wireless broadband as well as further benefits to television itself. With soaring growth projections for mobile data traffic creating pressure for further spectrum allocations for wireless broadband, lessons learned from the digital dividend process are likely to influence government approaches to spectrum re-farming in future. This article considers the development, over two decades, of an original policy 'blueprint' for TV digitisation in Australia as a case study in spectrum re-farming. While it is too early to judge its full effects, some preliminary lessons are drawn from the various approaches that have been considered and adopted or discarded along the way.


News Article | December 5, 2008
Site: arstechnica.com

It's tough being a government these days; who has the energy to clean up the Internet after a hard day's work bailing out the financial sector? Not the Australian government, it seems. Rather than actually doing something about illegal content, they just make a list of it and tell ISPs to filter everything that's on the list. Sidestepping the murky political details and—for the moment—the civil liberties problems inherent in this approach, let's take a closer look at the technical aspects of such a plan.

In the Internet Service Provider Content Filtering Pilot Technical Testing Framework document, the Australian Government Department of Broadband, Communications and the Digital Economy provides some details about what it wants ISPs to do in a pilot project. The main part is that ISPs who are interested in participating in the pilot will test solutions for filtering a list of at most 10,000 URLs on a blacklist maintained by the Australian Communications and Media Authority, a regulator not unlike the FCC. "Prohibited online content" includes what you would imagine, but also your garden-variety porn (yes, the stuff they broadcast over the air on public TV in the Netherlands) and, under special circumstances, even R-rated movies. Filtering URLs on the ACMA blacklist is a mandatory part of the pilot, though additional filters that aren't clearly specified are optional.

So how would an ISP go about blocking certain URLs from being accessed by its customers? First, a little refresher on how the Internet works. It all starts when a user types or clicks a URL. A browser or other application then looks up the domain name in the URL through the Domain Name System. This usually happens through a DNS server operated by the ISP, but that's not necessarily the case. URLs can also contain IP addresses, avoiding the need for a DNS lookup altogether. Then the browser starts sending packets to the IP address returned by the DNS; it is of course the ISP's job to make the packets flow in the right direction using the global routing system.

The first place where blocking can happen is on the user's computer. However, unless the Australian government is prepared to outlaw open source software and administer all of its residents' computers, this isn't going to work.

The next option is the DNS. Filtering in the DNS is doable, and has been done in the past. This should work well for most users, but it doesn't take too much tech savvy to configure an unfiltered DNS server, bypassing the ISP's filters. Another problem with DNS filters is that a single domain name may host both blocked and unblocked URLs. For truly illegal content this usually isn't much of a problem, but a popular technique for dodgy content is to publish content that is prohibited in the target jurisdiction on a big server elsewhere, where the content in question is legal. The blockers are then faced with the dilemma of whether to block a popular domain or let the offending content through.
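To make the DNS approach concrete, here is a minimal sketch of the decision logic inside a filtering resolver; the blacklist entries and sinkhole address are made up for illustration, and a real deployment would live inside the ISP's recursive resolver rather than a standalone function.

```python
# Toy illustration of a DNS-level filter's core decision logic.
# The sample blacklist entries are hypothetical.
from urllib.parse import urlparse

BLACKLIST = {
    "http://bad.example.com/illegal/page.html",
    "http://shared-host.example.net/one-bad-page",
}

# DNS never sees full URLs, only hostnames -- so the filter must
# collapse a URL-granular blacklist down to a set of whole domains.
BLOCKED_DOMAINS = {urlparse(u).hostname for u in BLACKLIST}

def resolve(hostname: str, upstream) -> str:
    """Return a sinkhole address for blocked domains, else pass through."""
    if hostname in BLOCKED_DOMAINS:
        return "0.0.0.0"          # sinkhole: the whole domain disappears,
                                  # including its perfectly legal pages
    return upstream(hostname)     # normal recursive resolution

# The bypass is equally simple: point your OS at any unfiltered
# resolver and this code never runs.
```

Note how the mismatch described above is baked in: the blacklist is URL-granular, but the resolver can only act on whole domains, so one bad page on a shared host takes every legal page on that host down with it.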
It's also possible to filter packets. In this case, the workaround is installing a proxy. This is not as easy as configuring different DNS addresses, but it's certainly doable. (The obvious counter-move by the government would then be to block the proxies.) However, filtering packets, or making them disappear by manipulating the routing system, has the same problem as DNS-based filters: a single address may host both legal and illegal content. And it's worse in the sense that many different DNS names may resolve to a single IP address. A common technique for hosting questionable content is to use a large number of servers with very different IP addresses and let the DNS cycle through these addresses in quick succession, making it hard to determine which addresses host the content in question. (And maybe throw in Google's addresses once in a while so those get blocked as well?)

Another issue is that the government set an upper limit of 10,000 URLs. This gives bad actors an obvious way to defeat the system: simply host prohibited content on more than 10,000 URLs. A DNS-based filter can probably be made to work with arbitrarily big blacklists, but any system that requires firewall rules or routing table entries to block addresses will be limited to something on the order of 10,000 blocked addresses or address ranges—simply upping the limit won't work, because the hardware would get too expensive.

A different approach to filtering is to intercept all packets and use deep packet inspection (DPI) to determine whether they're going to or from a blacklisted URL. This has two downsides: it obviously doesn't work for encrypted sessions, and it doesn't scale. Even medium-sized ISPs have many 1Gbps links, and the larger ones have 10Gbps links. At 10Gbps, a router, switch, or firewall has about 400 nanoseconds to decide what to do with a packet—not enough time to run through a list of 10,000 URLs. And that's assuming the target URL is conveniently present in a single packet, rather than having one half of the URL in one packet and the other half in another—and what happens when the second half is actually transmitted first? So the DPI equipment must do full TCP/IP processing and reconstruct TCP sessions from the packets flowing by. This can (maybe) work at 1Gbps speeds, but even then it requires hefty boxes, of which a big ISP would have to deploy a good number. And did I mention that simply using HTTPS defeats this type of filtering completely?
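The 400-nanosecond figure is just the serialization time of a typical packet at line rate. A quick sanity check, assuming an average packet size of roughly 500 bytes (the packet size is my assumption, not the article's):

```python
# Per-packet time budget at 10 Gbps, assuming ~500-byte average packets.
LINK_RATE_BPS = 10e9            # 10 Gbps link
PACKET_BITS = 500 * 8           # 500 bytes expressed in bits

budget_s = PACKET_BITS / LINK_RATE_BPS
print(f"{budget_s * 1e9:.0f} ns per packet")  # prints: 400 ns per packet
```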
An additional complication is that the Aussie government is interested in letting users opt out of (part of) the filtering. Users can turn off the porn blacklist, but all Australians will still be subject to a filter on "illegal content." This makes certain types of filtering a lot harder. As long as you have the routing table slots, it's easy to instruct routers to send packets for a certain IP address to the "null" interface so they are filtered. However, this is a binary thing: either packets from all users are filtered, or packets from all users are allowed through. Setting up two different filter levels makes everything more difficult.

My conclusion: this isn't going to work. There's no way to build a filter box that can filter all the URLs where porn is hosted throughout the Internet. A DNS-based filter that helps naive users avoid being confronted with explicit content would probably work to a certain degree. An IP-based filter for a small amount of very illegal content—the stuff that even the spam hosters in China don't want on their servers—may also work. But anything more ambitious than that is certain to fail; either it won't work very well, or it will bankrupt the ISPs. As for the ISPs, they tend to agree.

Here's an idea: if the Australian government actually finds child porn, nuclear bomb making manuals, and the like on the Internet, why not do its best to find the perpetrators and put them behind bars? That way we get to keep our free speech and have less crime and terrorism, rather than less of the former without actually reducing the latter. Then again, imposing restrictions on what local taxpayers can do is a lot easier than tracking down and rounding up international criminals and terrorists, and the filtering plan is moving forward despite its massive and fairly obvious drawbacks.


[by James Pearce] Tom Kennedy, CEO of MediaZoo, is also the chairman of the Australian Digital Content Industry Action Agenda, “an industry-led and Government-supported initiative. Its purpose is to ensure that the digital content industry in Australia reaches its full potential and stays competitive in the global economy.” On March 13th the report — Unlocking the Potential — was released. I spoke with Tom about the report and what the digital content industry needs to flourish. Some of the points are specific to Australia, but many are valid for any market — especially those wanting to export to larger markets internationally.

“Investment is the biggest barrier (to growth in the industry),” according to Tom. “How we attract investment in the sector, in innovation and content — some of that is tax driven, some is policy driven, some is just being able to better articulate the benefits and the growth potential.” Attracting sufficient investment is a problem in most markets, and sometimes the problem stems from legacy regulations and programs. For example, in Australia there are anomalies in the tax treatment of analogue content versus digital content, creating inhibitions for the digital industry. There are also subsidies and grants for content that are unavailable to producers of digital content.

One of the things that interests me is how a country regulates digital content relative to traditional content — for example, whether the local content standards that are so common in TV broadcasting should be applied to mobile content. The action group did not recommend introducing local content standards for digital content “because the digital content sector is still a sunrise sector, it’s still evolving, you don’t want to burden it with the same kind of production overheads yet — you may evolve to that over time,” said Tom. He said the group purposely stayed away from cultural arguments to focus on industry and economic outcomes, such as jobs created and tax revenue generated, which is probably the best idea with the current Australian government.

There is a common question of which organization should regulate digital and especially mobile content — the media regulator or the telecommunications regulator. Australia solved this by combining the two into the Australian Communications and Media Authority.

It’s important to have a strong domestic industry in order to export digital content, according to Tom. Although you can export digital content anywhere in the world over the internet (assuming you aren’t stymied by having to get the rights from a lot of third parties), getting the business paradigm right is trickier, and “if you don’t have a strong domestic industry it’s pretty hard to export from a vacuum.” There are large competitors out there — the US, UK, Canada and Asia — pouring money into digital content and services, which creates very strong competition internationally. “Also, this labour force is highly sought after and highly mobile, so it can move where the projects are and where the funding is fairly easily,” said Tom. This creates a problem Australia is familiar with, the so-called “brain drain”, and I’m pretty sure a lot of other nations are familiar with it too. “You can still produce this content from Australia and not necessarily have the value chain exist here.” According to Tom, the best thing governments can do to promote the sector is provide investment certainty.
“Especially in some of the new and emerging areas, people want to know that the rules aren’t going to change in the middle.” As for the industry, it needs to work together to gather the data and statistics that support its arguments.

Tom said that it is currently too expensive to produce content specifically for mobile, and that TV stations moving into the space aren’t charging for the value of the content but rather minimizing the risk of loss. However, distinctions based on the medium used to view digital content will become irrelevant over time, as consumers will want to transfer content across different devices and view it in different ways. That makes digital rights management issues very important, especially in terms of interoperability.

Another big issue in digital content is restricting viewers to a particular territory, which becomes a huge problem when dealing with international sporting events such as the Olympics or the FIFA World Cup. Different companies are given the rights to show the content, but only in a specific region or country — not a problem when media distribution was geographically controlled, but with the internet it gets much harder to restrict viewers to a particular territory. One of the benefits of the iTunes model is that it enforces territoriality. There are ways to control this, such as checking the viewer's IP address, credit card or billing address, but it’s difficult to make foolproof (a toy sketch of such an IP check appears at the end of this post).

We also talked about copyright issues in the digital age and the problems with Australia’s copyright laws, but since we talked those have been in the process of being changed. You can download the audio of the interview here (2.56 MB, 22 mins).
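As a rough sketch of the IP-based territorial check mentioned above: the address blocks here are documentation-reserved placeholder ranges, and the whole table is a made-up stand-in for the commercial geo-IP databases real services use. The easy bypass (a VPN or proxy in the licensed territory) is exactly why the check is hard to make foolproof.

```python
import ipaddress

# Hypothetical geo-IP table mapping address blocks to licence territories.
# Real deployments use commercial geo-IP databases instead of a dict.
GEO_TABLE = {
    ipaddress.ip_network("203.0.113.0/24"): "AU",
    ipaddress.ip_network("198.51.100.0/24"): "US",
}

def territory(ip: str) -> str | None:
    """Best-effort guess at the viewer's territory from their IP address."""
    addr = ipaddress.ip_address(ip)
    for net, country in GEO_TABLE.items():
        if addr in net:
            return country
    return None

def may_stream(ip: str, licensed_territories: set[str]) -> bool:
    # Easy to state, hard to make foolproof: a VPN or proxy puts the
    # client behind an address from whatever territory it likes.
    return territory(ip) in licensed_territories

print(may_stream("203.0.113.7", {"AU"}))   # True
print(may_stream("198.51.100.9", {"AU"}))  # False
```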


News Article | March 6, 2007
Site: gigaom.com

The Australian Communications and Media Authority has said that several major Australian markets — including Sydney, Adelaide, Darwin, the Illawarra, Newcastle, the Gold Coast and the Sunshine Coast — will have poor reception for mobile TV services in some areas, but contrary to some media reports the service will be available. The problem is a technical one, stemming from the fact that the ACMA expects mobile TV services to require more transmitters than other TV services because the receiving antennas are lower and smaller (a back-of-the-envelope illustration appears at the end of this article). “Those additional repeaters may present a potential source of interference to fixed reception analog and digital television services operating on adjacent spectrum, and their deployment may therefore need to be constrained by the need to protect operating or planned broadcasting services,” the ACMA wrote in a paper on the issue (PDF).

I spoke to Donald Robertson from the ACMA, who said this means that people who get good reception from existing TV transmitters will get good reception from the mobile TV service, but at the outer edges of the reception area the signal may fall away. TV broadcasters are currently required to broadcast their signal in both analogue and digital format (to allow consumers time to upgrade their TVs for the change-over); this requirement will end between 2010 and 2012, at which time the spectrum will become less congested and the ACMA will look at ways to improve reception for mobile TV.

I think the big issue will come down to the boundaries of the good-reception area for mobile TV. If it’s sufficiently large, it shouldn’t deter many organizations from bidding to roll out the service. However, they’d have to make the limits clear — the whole point of broadcast mobile TV is to provide better signal quality than mobile TV over 3G networks, and if the signal quality isn’t good, customers are going to leave, as they are already doing in Europe. Related stories: —Australia’s Mobile TV Fight —Telstra May Go For IPTV, Avoid Spectrum Auction
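To see why receiver antenna height matters so much, here is a back-of-the-envelope sketch using the empirical Okumura-Hata path-loss model (not anything the ACMA used); the frequency, antenna heights and loss budget are illustrative assumptions.

```python
import math

def hata_radius(l_max_db, f_mhz=600.0, h_base=100.0, h_mobile=1.5):
    """Okumura-Hata (small/medium city) coverage radius in km
    for a given maximum tolerable path loss."""
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile
            - (1.56 * math.log10(f_mhz) - 0.8))   # mobile-antenna correction
    fixed = (69.55 + 26.16 * math.log10(f_mhz)
             - 13.82 * math.log10(h_base) - a_hm)
    slope = 44.9 - 6.55 * math.log10(h_base)
    return 10 ** ((l_max_db - fixed) / slope)     # invert the loss formula

# Same transmitter, same 135 dB loss budget -- only the receiver changes.
print(f"rooftop TV antenna (10 m): {hata_radius(135, h_mobile=10.0):5.1f} km")
print(f"handheld receiver (1.5 m): {hata_radius(135, h_mobile=1.5):5.1f} km")
```

On these made-up numbers the radius drops from roughly 19 km to roughly 4 km, and since covered area scales with the square of the radius, a network aimed at handheld receivers needs well over ten times as many transmitter sites for the same footprint. That is exactly the repeater proliferation, and the attendant adjacent-channel interference risk, that the ACMA describes.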
