Pyrmont, Australia

The Australian Communications and Media Authority is an Australian Government statutory authority within the Communications portfolio. The ACMA is tasked with ensuring media and communications work for all Australians. It does this through various legislation, regulations, standards and codes of practice. The ACMA is a 'converged' regulator, created to oversee the convergence of the four 'worlds' of telecommunications, broadcasting, radiocommunications and the internet. The ACMA was formed on 1 July 2005 by a merger of the Australian Broadcasting Authority and the Australian Communications Authority. It is one of only a handful of converged communications regulators in the world. (Source: Wikipedia)

The Public Safety LTE & Mobile Broadband Market: 2016–2030 research estimates that annual investment in public safety LTE infrastructure will reach $600 million by the end of 2016. The market, which includes base stations (eNBs), mobile core and transport networking gear, is further expected to grow at a CAGR of 33% over the next four years. By 2020, these infrastructure investments will be complemented by over 4.4 million LTE device shipments, including smartphones, rugged handheld terminals and vehicular routers. The complete 529-page report, "Public Safety LTE & Mobile Broadband Market: 2016–2030 – Opportunities, Challenges, Strategies & Forecasts", is available from the publisher.

Following the Qatar Ministry of Interior's private 800 MHz LTE network deployment in 2012, multiple private LTE rollouts are underway by security forces throughout the oil-rich GCC (Gulf Cooperation Council) region, including the Abu Dhabi and Dubai police forces. Driven by nationwide public safety LTE network rollouts in the United States and South Korea, the North America and Asia Pacific regions will account for nearly 70% of all public safety LTE investments over the next four years. Almost all major LMR industry players are leveraging partnerships with established LTE infrastructure OEMs such as Ericsson, Nokia, Huawei and Samsung to offer end-to-end LTE solutions. Consolidation efforts continue throughout the industry, particularly among the largest LTE infrastructure OEMs and public safety system integrators. The report presents an in-depth assessment of the global public safety LTE market, while also touching on the wider LMR and mobile broadband industries.
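The release's headline figures can be sanity-checked with simple compounding (the $600 million base and 33% CAGR are from the press release; the arithmetic sketch itself is ours):

```python
# Compounding the press release's own figures (assumption: the 33% CAGR
# applies annually to the $600M 2016 base) gives the implied 2020 market size.
base_2016_usd_m = 600.0          # annual infrastructure spend, 2016 ($M)
cagr = 0.33                      # compound annual growth rate
years = 4                        # 2016 -> 2020

spend_2020_usd_m = base_2016_usd_m * (1 + cagr) ** years   # roughly $1,877M
```

In other words, the forecast implies the annual infrastructure market roughly tripling over the forecast window.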
In addition to covering the business case, challenges, technology, spectrum allocation, industry roadmap, value chain, deployment case studies, vendor products, strategies, standardization initiatives and applications ecosystem for public safety LTE, the report also presents comprehensive forecasts for mobile broadband, LMR and public safety LTE subscriptions from 2016 through 2030. Also covered are public safety LTE service revenues over both private and commercial networks. In addition, the report presents revenue forecasts for public safety LTE infrastructure, devices, integration services and management solutions. A copy of the report can be ordered from the publisher. The report comes with an associated Excel datasheet suite covering quantitative data from all numeric forecasts presented in the report, as well as a list and associated details of over 90 global public safety LTE network commitments (as of Q2 2016).

Topics Covered:
• Business case for public safety LTE and mobile broadband services, including key benefits and challenges
• Technology, economics, trends, commercial commitments and deployment case studies
• List of public safety LTE engagements worldwide
• Public safety LTE infrastructure, devices and applications
• Industry roadmap, value chain and standardization initiatives
• Spectrum allocation, deployment models and funding strategies
• Profiles and strategies of over 260 ecosystem players including public safety system integrators and LTE infrastructure/device OEMs
• TCO analysis of private and commercial public safety LTE deployments
• Military and tactical LTE deployments
• Public safety LTE base station (eNB) form factor analysis
• Exclusive interview transcripts from 5 key ecosystem players: Ericsson, Airbus Defence and Space, Sepura, Aricent and Parallel Wireless
• Strategic recommendations for vendors, system integrators, public safety agencies and mobile operators
• Market analysis and forecasts from 2016 through 2030

Forecast Segmentation: Market forecasts are provided for each of the following submarkets and their subcategories:

Public Safety LTE Infrastructure
• Submarkets: RAN (Radio Access Network); EPC (Evolved Packet Core) and Policy; Mobile Backhaul and Transport
• RAN Base Station (eNB) Mobility Categories: Fixed Base Stations; Transportable Base Stations
• RAN Base Station (eNB) Cell Size Categories: Macrocells; Small Cells
• Transportable RAN Base Station (eNB) Form Factor Categories: NIB (Network-in-a-Box); VNS (Vehicle Network System); SOW (System-on-Wheels); Airborne Platform

Public Safety LTE Applications
• Submarkets: Video Applications; GIS, AVLS and Mapping; Mobile VPN Access & Security; CAD (Computer Aided Dispatching); Remote Database Access; Telemetry and Remote Diagnostics; Bulk Multimedia/Data Transfers; PTT & Voice over LTE; Situational Awareness Applications

List of Companies Mentioned: 3GPP (Third Generation Partnership Project), Aaeon, Abu Dhabi Police, Accelleran, AceAxis, ACMA (Australian Communications and Media Authority), Aculab, Adax, ADCOM911 (Adams County Communication Center), ADRF (Advanced RF Technologies), Advantech, Advantech Wireless, Aeroflex, Affarii Technologies, Affirmed Networks, Agile Networks, Airbus Defence and Space and more.

A SAMPLE COPY of the report is available.

About Us: Market Reports Hub is your one-stop online shop for syndicated industry research reports on 25+ categories and their sub-sectors. It brings the latest in market research across multiple industries and geographies from leading research publishers across the globe. For more information, please visit Market Reports Hub online.

Freyens B.P., University of Canberra | Loney M., Australian Communications and Media Authority
Telecommunications Policy | Year: 2013

The secondary use of vacant television channels (TV white spaces) and the reallocation of the digital dividend to provide wireless broadband services are in the final stages of implementation in some countries. Originally seen as a once-in-a-generation opportunity to better allocate UHF spectrum, further digital dividends are now underway as regulators and industry strive to meet exponential increases in demand for mobile data services. Concurrent developments suggest that TV white spaces may be rapidly exploited by global networks with billions of supported devices. The potential for sub-optimal outcomes is identified if the prospect of further digital dividends is not taken into account as technical and regulatory arrangements are put in place to allow productive use of TV white spaces. The importance of considering the potential interaction between further digital dividends and the use of TV white spaces is discussed, and technical and regulatory approaches to support optimal outcomes are identified. © 2012 Elsevier Ltd.

Freyens B.P., University of Canberra | Loney M., Australian Communications and Media Authority
2011 IEEE International Symposium on Dynamic Spectrum Access Networks, DySPAN 2011 | Year: 2011

There has been sustained regulatory support for the development and use of "white space" devices on UHF broadcast spectrum, particularly to provide wireless broadband services on a secondary or "unlicensed" basis. However, as regulators reallocate UHF spectrum released by the digital switchover to new services requiring a high degree of licence certainty (e.g. cellular networks), there will be incompatibilities between the rights of the new licensees and those of unlicensed white space users. What becomes of entrenched secondary usage rights if broadcast spectrum is reallocated to telecommunications and re-licensed on far more exclusive conditions than those currently prevailing for white space devices operating on a secondary basis to broadcasting services? Widespread deployment of white space devices could seriously complicate the reallocation of UHF band primary services from broadcasting to higher value users. This article considers Australian regulatory arrangements in light of this issue and suggests licensing reforms required to manage competing white space usage rights in the future. © 2011 IEEE.

News Article | November 20, 2015

That's what the Australian Communications and Media Authority (ACMA) has done this week with a very timely occasional paper on the Internet of Things (IoT). As well as identifying issues of direct concern to the ACMA, the paper also includes an overview of the technology and its capabilities.

The IoT is the bringing together of a very large number of devices, data and computing power through the internet. The internet at the moment usually has a human at one or both ends of the communication. In the IoT, most communications will have sensors, actuators, databases or cloud-based computing processes at either end. It is the linking of data from a large number of devices to the tremendous computing power of the cloud that makes the IoT so interesting.

Sensor networks and machine-to-machine communication have been around for quite some time now, but have mostly run over the cellular telephony network or over short-range mesh networks such as ZigBee. Generally, the processing of data generated by these networks has been reasonably straightforward, such as pollution monitoring or device tracking. But the linking of these devices to the internet opens up many new possibilities. Large-scale deployment of sensor networks will generate vast amounts of data which can be moved via the internet to be processed using the huge resources of cloud computing. There are potential applications in health, aged care, infrastructure, transport and emergency services, among others. Terms such as "smart cities" and "smart infrastructure" have been coined to refer to the capabilities of combining large-scale sensor networks with cloud computing.

So, for example, smoke alarms might be integrated with fire services. A rapid increase in the number of alarms may indicate (for example) an explosion in a factory. Data from the alarms, along with the sequence and pattern of the alarms, might be processed to give information as to the nature, location and extent of the explosion.
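The smoke-alarm scenario can be made concrete with a toy sketch (the data and the centroid heuristic below are our own illustration, not anything from the ACMA paper): alarms nearest an incident tend to fire first, so averaging the positions of the earliest activations gives a crude estimate of where the incident happened.

```python
# Toy illustration of inferring an incident location from an alarm burst
# (hypothetical data; a real system would use far more sophisticated models).

def estimate_origin(alarms, window_s=5.0):
    """alarms: list of (x, y, t) tuples; average the positions of alarms
    that fired within window_s seconds of the first activation."""
    t0 = min(t for _, _, t in alarms)
    early = [(x, y) for x, y, t in alarms if t - t0 <= window_s]
    cx = sum(x for x, _ in early) / len(early)
    cy = sum(y for _, y in early) / len(early)
    return cx, cy

# Hypothetical burst: three alarms near the incident, one distant straggler.
burst = [(10, 10, 0.0), (12, 9, 1.2), (9, 12, 2.5), (40, 40, 30.0)]
origin = estimate_origin(burst)
```

The late, distant alarm is excluded by the time window, so the estimate stays near the cluster that fired first.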
The ACMA paper has some discussion of projections for the take up of the technology. These seem extraordinary. There is a reference to a recent McKinsey report that estimates worldwide productivity gains of US$11.1-trillion a year by 2025. Catherine Livingstone, chair of Telstra, believes that the changes brought by IoT will dwarf those we saw with the fixed line internet in the mid-1990s and the mobile internet in the mid-2000s. What is even more extraordinary is the expected speed of the take up of these technologies. Cisco expects 50-billion devices to be connected to the internet by 2020 compared to the 15-billion currently connected. There is certainly a great deal of activity in this area and consequently, there is some urgency in making sure that there is a suitable regulatory framework for it. This is what the paper deals with. The paper is an invitation for interested parties to comment on ACMA's plans for the area. The most interesting part of the paper is that describing ACMA's current, medium term and long term IoT focus. Current concerns include availability of spectrum, mobile numbers and information exchange. Spectrum refers to the frequency ranges available for wireless communication of the sensors and actuators attached to the IoT. The precursor to the IoT is Machine to Machine Communications (M2M). This has relied primarily on the mobile telephone network. Back in 2012 ACMA made available a new mobile number range (05) to supplement the existing (04) range. If there is an explosion in the number of devices there may need to be additional number ranges. Short range sensor networks make use of unlicensed spectrum such as that used by Wi-Fi. The paper looks at the suitability of existing unlicensed spectrum arrangements and the possibility of new spectrum in the 6GHz range being made available. It also identifies the emergence of long range communications (such as LoRa) using unlicensed spectrum. The other area is how "harms" can be addressed. 
In this context "harms" refers to issues related to breaches of privacy, security and other problems that we may not yet understand. Managing "harms" involves the exchange of information between parties. For example, dealing with a computer that is infected by malware may need cooperative behaviour between a number of parties. How will that be done in the IoT world? Longer-term concerns identified in the paper include network security and reliability, as well as the capabilities of businesses and consumers to manage their devices and information. All in all, the paper is a welcome addition to discussion of an increasingly important area. The ACMA is looking for feedback on the paper, which can be submitted online before December 14, 2015.

Xiong L., Australian Communications and Media Authority | Xiong L., University of Sydney | Libman L., University of Sydney | Libman L., NICTA | Mao G., University of Sydney
IEEE Journal on Selected Areas in Communications | Year: 2012

Cooperative communication techniques offer significant performance benefits over traditional methods that do not exploit the broadcast nature of wireless transmissions. Such techniques generally require advance coordination among the participating nodes to discover available neighbors and negotiate the cooperation strategy. However, the associated discovery and negotiation overheads may negate much of the cooperation benefit in mobile networks with highly dynamic or unstable topologies (e.g. vehicular networks). This paper discusses uncoordinated cooperation strategies, where each node overhearing a packet decides independently whether to retransmit it, without any coordination with the transmitter, intended receiver, or other neighbors in the vicinity. We formulate and solve the problem of finding the optimal uncoordinated retransmission probability at every location as a function of only a priori statistical information about the local environment, namely the node density and radio propagation model. We show that the solution consists of an optimal cooperation region, for which we provide a constructive method of explicit computation. Our numerical evaluation demonstrates that uncoordinated cooperation offers a viable low-overhead alternative, especially in high-noise (or low-power) and high node density scenarios. © 2006 IEEE.
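The trade-off at the heart of uncoordinated cooperation can be seen in a toy calculation (our simplification, not the paper's actual optimization): if each overhearing node retransmits independently with some probability, raising that probability improves the chance that at least one relay helps, but also raises the redundant load on the channel.

```python
# Toy illustration: k overhearing nodes each retransmit independently with
# probability p. Success probability rises with p, but so does the expected
# number of redundant transmissions (the channel overhead).

def at_least_one_relay(k: int, p: float) -> float:
    """Probability that at least one of k independent nodes retransmits."""
    return 1 - (1 - p) ** k

def expected_retransmissions(k: int, p: float) -> float:
    """Expected number of simultaneous retransmissions (channel overhead)."""
    return k * p

k = 10                                # assumed number of overhearing nodes
low_p = at_least_one_relay(k, 0.1)    # modest success, overhead ~1 packet
high_p = at_least_one_relay(k, 0.5)   # near-certain success, overhead ~5 packets
```

The paper's contribution is choosing the retransmission probability per location so this balance is optimal given only node density and the propagation model.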

Tanner G., Australian Communications and Media Authority
Telecommunications Journal of Australia | Year: 2013

The re-farming of spectrum (i.e. its reassignment to services with a higher value), triggered by free-to-air television digitisation and the switch-off of analogue TV, represents a significant microeconomic reform. Digitisation has already transformed the television industry, and analogue TV switch-off will yield a 'digital dividend' for wireless broadband as well as further benefits to television itself. With soaring growth projections for mobile data traffic creating pressure for further spectrum allocations for wireless broadband, lessons learned from the digital dividend process are liable to influence government approaches to spectrum re-farming in future. This article considers the development, over two decades, of an original policy 'blueprint' for TV digitisation in Australia as a case-study in spectrum re-farming. While it is too early to judge its full effects, some preliminary lessons are drawn from the various approaches that have been considered and adopted or discarded along the way.

News Article | December 5, 2008

It's tough being a government these days; who has the energy to clean up the Internet after a hard day's work bailing out the financial sector? Not the Australian government, it seems. Rather than actually doing something about illegal content, they just make a list of it and tell ISPs to filter everything that's on the list. Sidestepping the murky political details and—for the moment—the civil liberties problems inherent in this approach, let's take a closer look at the technical aspects of such a plan. In the Internet Service Provider Content Filtering Pilot Technical Testing Framework document, the Australian Government Department of Broadband Communications and the Digital Economy provides some details about what it wants ISPs to do in a pilot project. The main part is that ISPs who are interested in participating in the pilot will test solutions for filtering a list of at most 10,000 URLs on a blacklist maintained by the Australian Communications and Media Authority, a regulator not unlike the FCC. "Prohibited online content" includes what you would imagine, but also your garden variety porn (yes, the stuff they broadcast over the air on public TV in the Netherlands), and under special circumstances even R-rated movies. Filtering URLs on the ACMA blacklist is a mandatory part of the pilot, though additional filters that aren't clearly specified are optional. So how would an ISP go about blocking certain URLs from being accessed by its customers? First, a little refresher on how the Internet works. It all starts when a user types or clicks a URL. A browser or other application then looks up the domain name in the URL through the Domain Name Service. This usually happens through a DNS server operated by the ISP, but that's not necessarily the case. URLs can also contain IP addresses, avoiding the need for a DNS lookup altogether. Then, the browser starts sending packets to the IP address returned by the DNS. 
It is of course the ISP's job to make the packets flow in the right direction using the global routing system. The first place where blocking can happen is on the user's computer. However, unless the Australian government is prepared to outlaw open source software and administer all of its residents' computers, this isn't going to work. The next option is the DNS. Filtering in the DNS is doable, and has been done in the past. This should work well for most users, but it doesn't take too much tech savvy to configure an unfiltered DNS server, bypassing the ISP's filters. Another problem with DNS filters is that a single domain name may host both blocked and unblocked URLs. For truly illegal content this usually isn't much of a problem, but a popular technique for dodgy content is to publish content that is prohibited in the target jurisdiction on a big server elsewhere, where the content in question is legal. Then the blockers are faced with the dilemma of whether to block a popular domain or let the offending content through. It's also possible to filter packets. In this case, the workaround is installing a proxy. This is not as easy as configuring different DNS addresses, but it's certainly doable. (The obvious counter-action by the government would then be to block the proxies.) However, filtering packets, or making them disappear by manipulating the routing system, has the same problem as DNS-based filters: a single address may host both legal and illegal content. And it's worse in the sense that many different DNS names may resolve to a single IP address. A common technique for hosting questionable content is to use a large number of servers with very different IP addresses and let the DNS cycle through these addresses in quick succession, making it hard to determine which addresses host the content in question. (And maybe throw in Google's addresses once in a while so those get blocked as well?)
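DNS-level blocking, and the bypass described above, can be sketched in a few lines (all hostnames and addresses here are hypothetical documentation examples, not entries from the actual ACMA blacklist):

```python
# Sketch of DNS-level blocking and its obvious bypass. An ISP resolver that
# filters simply refuses to answer for blocklisted domains; a user who points
# at an unfiltered resolver gets the real answer, which is why DNS filtering
# only stops users who never change their settings.

BLOCKLIST = {"blocked.example"}          # note: domain-level, not URL-level

def filtered_resolve(hostname, upstream):
    """Resolve via `upstream` (a dict standing in for a real DNS server),
    returning None for blocklisted names -- the ISP-resolver behaviour."""
    if hostname in BLOCKLIST:
        return None                      # NXDOMAIN / sinkhole response
    return upstream.get(hostname)

UPSTREAM_DNS = {"blocked.example": "192.0.2.1", "news.example": "192.0.2.2"}

via_isp = filtered_resolve("blocked.example", UPSTREAM_DNS)   # blocked
via_open_resolver = UPSTREAM_DNS.get("blocked.example")       # bypassed

# The granularity problem from above is also visible here: blocking
# "blocked.example" blocks every URL on that host, legal or not.
```

The bypass is exactly the "configure an unfiltered DNS server" step: the same query against the upstream data directly returns the address the ISP resolver withheld.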
Another issue is that the government set an upper limit of 10,000 URLs. This gives bad actors an obvious way to defeat the system: simply host prohibited content on more than 10,000 URLs. A DNS-based filter can probably be made to work with arbitrarily big blacklists, but any system that requires firewall rules or routing table entries to block addresses will be limited to something on the order of 10,000 blocked addresses or address ranges—simply upping the limit won't work because the hardware would get too expensive. A different approach to filtering is to intercept all packets and use deep packet inspection to determine if they're going to or from a blacklisted URL. This has two downsides: it obviously doesn't work for encrypted sessions, and it doesn't scale. Even medium-sized ISPs have many 1Gbps links, and the larger ones have 10Gbps links. At 10Gbps, a router, switch, or firewall has about 400 nanoseconds to decide what to do with a packet—not enough time to run through a list of 10,000 URLs. And that's assuming that the target URL is conveniently present in a single packet, rather than having one half of the URL in one packet and the other half in another—and what happens when the second half is actually transmitted first? So the DPI equipment must do full TCP/IP processing and reconstruct TCP sessions from the packets flowing by. This can (maybe) work at 1Gbps speeds, but even then it requires hefty boxes, of which a big ISP would have to deploy a good number. And did I mention that simply using HTTPS defeats this type of filtering completely? An additional complication here is that the Aussie government is interested in letting users opt out of (part of) the filtering. Users can turn off the porn blacklist, but all Australians will still be subject to a filter on "illegal content." This makes certain types of filtering a lot harder.
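The "about 400 nanoseconds" figure above is a straightforward back-of-the-envelope calculation (the 500-byte average packet size is our assumption; real traffic mixes tiny ACKs with full-size frames):

```python
# Per-packet time budget on a 10 Gbps link, assuming ~500-byte average packets.
link_bps = 10e9                                        # 10 Gbps link
avg_packet_bytes = 500                                 # assumed average size

packets_per_sec = link_bps / (avg_packet_bytes * 8)    # 2.5 million packets/s
ns_per_packet = 1e9 / packets_per_sec                  # time budget per packet
```

At 2.5 million packets per second the box gets 400 ns per packet, which is why a linear scan of a 10,000-entry URL list is out of the question at line rate.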
As long as you have the routing table slots, it's easy to instruct routers to send packets for a certain IP address to the "null" interface so they are filtered. However, this is a binary thing: packets from all users are filtered, or packets from all users are allowed through. Setting up two different filter levels makes everything more difficult. My conclusion: this isn't going to work. There's no way to build a filter box that can filter all the URLs where porn is hosted throughout the Internet. A DNS-based filter that helps naive users avoid being confronted with explicit content would probably work to a certain degree. An IP-based filter for a small amount of very illegal content—that would be the stuff that even the spam hosters in China don't want on their servers—may also work. But anything more ambitious than that is certain to fail; either it won't work very well, or it will bankrupt the ISPs. As for the ISPs, they tend to agree. Here's an idea: if the Australian government actually finds child porn, nuclear bomb making manuals, and the like on the Internet, why not do their best to find the perpetrators and put them behind bars? That way we get to keep our free speech and have less crime and terrorism, rather than less of the former without actually reducing the latter. Then again, imposing restrictions on what local taxpayers can do is a lot easier than tracking down and rounding up international criminals and terrorists, and the filtering plan is moving forward despite the massive and fairly obvious drawbacks.
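The "binary" nature of null-routing described above can be sketched with Python's `ipaddress` module (the prefix below is a documentation-range placeholder, not a real blacklist entry):

```python
import ipaddress

# Sketch of why null-routing is all-or-nothing: a blackhole route matches
# only on destination prefix, so it cannot tell an opted-out user's packets
# from anyone else's -- either everyone reaches the prefix, or no one does.

BLACKHOLED = [ipaddress.ip_network("203.0.113.0/24")]

def is_dropped(dst_ip: str) -> bool:
    """True if a packet to dst_ip would be sent to the null interface."""
    addr = ipaddress.ip_address(dst_ip)
    return any(addr in net for net in BLACKHOLED)

dropped = is_dropped("203.0.113.7")     # dropped for ALL users, opt-out or not
passed = is_dropped("198.51.100.7")     # forwarded for ALL users
```

Supporting two filter levels means the lookup must also key on the *source* address (which user sent the packet), which is exactly the extra state that makes per-user filtering harder and more expensive.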

[by James Pearce] Tom Kennedy, CEO of MediaZoo, is also the chairman of the Australian Digital Content Industry Action Agenda, “an industry-led and Government-supported initiative. Its purpose is to ensure that the digital content industry in Australia reaches its full potential and stays competitive in the global economy.” On March 13th the report — Unlocking the Potential — was released. I spoke with Tom about the report and what the digital content industry needs to flourish. Some of the points are specific to Australia, but many are valid for any market — especially those wanting to export to larger markets internationally. “Investment is the biggest barrier (to growth in the industry),” according to Tom. “How we attract investment in the sector, in innovation and content, some of that is tax driven, some is policy driven, some is just being able to better articulate the benefits and the growth potential.” Attracting sufficient investment is a problem in most markets, and sometimes the problem stems from legacy regulations and programs. For example, in Australia there are anomalies in tax treatment for analogue content versus digital content, creating inhibitions for the digital industry. There are also subsidies and grants for content that are unavailable to producers of digital content. One of the things that interests me is how a country regulates digital content in relation to normal content, for example whether the local content standards that are so common in TV broadcasting should be applied to mobile content. The action group did not recommend introducing local content standards for digital content “because the digital content sector is still a sunrise sector, it’s still evolving, you don’t want to burden it with the same kind of production overheads yet — you may evolve to that over time,” said Tom.
He said the group purposely stayed away from cultural arguments to focus on industry and economic outcomes such as jobs created and tax revenue generated, which is probably the best idea with the current Australian government. There is a common question of which organization should regulate digital and especially mobile content: the media regulator or the telecommunications regulator. Australia solved this by combining the two into the Australian Communications and Media Authority. It’s important to have a strong domestic industry in order to export digital content, according to Tom. Although you can export digital content anywhere in the world on the internet (assuming you aren’t stymied by having to get the rights from a lot of third parties), getting the business paradigm right is trickier, and “if you don’t have a strong domestic industry it’s pretty hard to export from a vacuum”. There are large competitors out there such as the US, UK, Canada and Asia pouring money into digital content and services, which creates some very strong competition internationally. “Also, this labour force is highly sought after and highly mobile, so it can move where the projects are and where the funding is fairly easily,” said Tom. This creates a problem Australia is familiar with, the so-called “brain drain”, and I’m pretty sure a lot of other nations are familiar with it also. “You can still produce this content from Australia and not necessarily have the value chain exist here.” According to Tom, the best thing that governments can do to promote the sector is provide investment certainty. “Especially in some of the new and emerging areas people want to know that the rules aren’t going to change in the middle.” As for the industry, it needs to work together to get the data and statistics to support its arguments.
Tom said that it is currently too expensive to produce content specifically for mobile, and TV stations moving into the space weren’t charging for the value of the content but rather minimizing the risk of loss. However, distinctions in the medium used to view digital content will become irrelevant over time as consumers will want to transfer content across different devices and view it differently there, so digital rights management issues are very important, especially in terms of interoperability. Another big issue in digital content is restricting viewers to a particular territory, which becomes a huge problem when dealing with international sporting events such as the Olympics or the FIFA World Cup. Different companies are given the rights to show the content, but only in a specific region or country — not a problem when the media was geographically controlled, but with the internet it gets more difficult to restrict viewers to a particular territory. One of the benefits of the iTunes model is that it enforces territoriality. There are ways to control this, such as via IP address or checking a credit card or address, but it’s difficult to be foolproof. We also talked about copyright issues in the digital age and the problem with Australia’s copyright laws, but since we talked, that has been in the process of being changed. You can download the audio of the interview (2.56 MB, 22 mins).

News Article | March 6, 2007

The Australian Communications and Media Authority has said that several major Australian markets — including Sydney, Adelaide, Darwin, Illawarra, Newcastle, the Gold Coast and the Sunshine Coast — will have poor reception for mobile TV services in some areas, but contrary to some media reports the service will be available. The problem is a technical one stemming from the fact that the ACMA expects mobile TV services to require more transmitters than other TV services because the receiving antennas are lower and smaller. “Those additional repeaters may present a potential source of interference to fixed reception analog and digital television services operating on adjacent spectrum, and their deployment may therefore need to be constrained by the need to protect operating or planned broadcasting services,” the ACMA wrote in a paper on the issue (PDF). I spoke to Donald Robertson from the ACMA, who said this means that the people who get good reception from existing TV transmitters will get good reception from the mobile TV service, but on the outer edges of the reception area the reception may fall away. TV broadcasters are currently required to broadcast their signal in both analogue and digital format (to allow consumers time to upgrade their TVs for the change-over) — this requirement will end between 2010 and 2012, at which time the spectrum will become less congested and the ACMA will look at ways to improve reception for mobile TV. I think the big issue will come down to the boundaries of the good reception area for mobile TV… if it’s sufficiently large, it shouldn’t deter many organizations from bidding to roll out the service. However, they’d have to make the limits clear — the whole point of broadcast mobile TV is to provide better signal quality than mobile TV over 3G networks — if the signal quality isn’t good, customers are going to leave, as they are already doing in Europe.
Related stories: —Australia’s Mobile TV Fight —Telstra May Go For IPTV, Avoid Spectrum Auction
