
DFCA - Dynamic Frequency and Channel Allocation.

This is a BSC feature which dynamically assigns a radio channel to new connections, using interference estimates from downlink measurement reports combined with timeslot and frequency usage information.
The basic idea of DFCA is to provide sufficient quality, in terms of C/I, to meet the QoS requirements. It supports all speech traffic types: EFR, HR and AMR.
Main benefits:
1. By guaranteeing sufficient C/I to the user (see the sketch below), network performance improves significantly in terms of RXQUAL, FER and dropped calls.
2. It acts as a capacity booster, since the use of valuable spectrum is dynamically optimised. By decreasing the frequency reuse distance, DFCA makes it possible to add more TRXs to existing BTSs without quality degradation.
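
The channel selection logic can be pictured roughly as follows. This is only a minimal sketch of the idea described above, not the actual BSC algorithm; the C/I targets, data structures and function names are assumptions made for illustration.

```python
import math

# Assumed C/I targets in dB per speech type (illustrative values only).
CI_TARGET_DB = {"EFR": 12.0, "HR": 15.0, "AMR": 9.0}

def estimate_ci_db(carrier_dbm, interferer_levels_dbm):
    """Estimate C/I in dB from a carrier level and a list of co-channel
    interferer levels, all expressed in dBm."""
    if not interferer_levels_dbm:
        return float("inf")
    interference_mw = sum(10 ** (p / 10.0) for p in interferer_levels_dbm)
    carrier_mw = 10 ** (carrier_dbm / 10.0)
    return 10.0 * math.log10(carrier_mw / interference_mw)

def assign_channel(codec, candidates):
    """candidates: list of (freq, timeslot, carrier_dbm, interferers_dbm)
    for channels that are currently free. Returns the channel that meets
    the C/I target with the smallest excess margin, which keeps the
    effective reuse as tight as possible."""
    target = CI_TARGET_DB[codec]
    feasible = []
    for freq, ts, carrier_dbm, interferers_dbm in candidates:
        ci = estimate_ci_db(carrier_dbm, interferers_dbm)
        if ci >= target:
            feasible.append((ci - target, freq, ts))
    if not feasible:
        return None  # no candidate meets the target; block or fall back
    _, freq, ts = min(feasible)
    return freq, ts
```

Picking the feasible channel with the least excess margin is one plausible packing strategy; a real implementation would also have to account for uplink interference and frequency-hopping constraints.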

Smartphones: Are They the Real Deal?

By Rafael Junquera, Editorial Director, TeleSemana.com

Business changes are happening at a very rapid pace, and many find it difficult to adapt
when they strike unexpectedly. Changes in the business environment can come from
outside, as in the classic example of a demographic shift, or from inside, as when a
new technology is integrated into the business organization with the intention of
increasing efficiency and competitiveness and thus maximizing revenue and profits.

Smartphones could very well fall into the category of business change that mobile
operators are inflicting on themselves. Their introduction is seen as a positive move,
and so in the last two years operators have waged an aggressive campaign to introduce
smartphones to their existing subscriber base, with high subsidies and entertainment
services, to make sure these devices are considered neither a luxury item nor a
business-sector device.

Smartphones Change the Game


The iPhone launch in 2007 was, first, a qualitative leap towards a new smartphone
format; its design and functionality were not geared to the corporate sector, and although
it came with a high price tag (except when subsidized), it was not seen as a luxury item
but as a necessary commodity. Apple created mass-market adoption of smartphones by
forcing other manufacturers to follow its smartphone dogma.

During the past two years, the smartphone market activity has been frenetic —both at the
hardware and software levels— and its progression in the market has been unstoppable.
While conventional devices suffered sales setbacks in 2008 and virtually all of 2009 due
to the global economic crisis, smartphones grew at a rate of over 20 percent. Gartner
estimated late last year that sales of these devices had grown by 23.6 percent in 2009
when compared to 2008, representing 14 percent of total sales for the year.

Forward-looking statements are equally optimistic. Infonetics Research estimates that
smartphones will grow at a compound annual growth rate (CAGR) of 21 percent between
2008 and 2013. The group also says that by 2012 the majority of phones sold will fall
into this category, while, as previously mentioned, Gartner estimated that in 2009 they
were only 14 percent of total sales.
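
As a rough reminder of what a compound annual growth rate means, the snippet below compounds an indexed base of 100 forward at 21 percent per year over the 2008-2013 window; the numbers are illustrative, not Infonetics' actual unit figures.

```python
# Illustrative only: what a 21 percent CAGR implies over five years.
def project(initial, cagr, years):
    """Compound an initial volume forward at a fixed annual growth rate."""
    return initial * (1 + cagr) ** years

# An indexed base of 100 in 2008 reaches roughly 259 by 2013.
print(round(project(100, 0.21, 5)))  # -> 259
```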

Although the trend indicates mature markets are adopting these devices more quickly,
Pyramid Research forecast just a few months ago that 150 million smartphones will have
been sold in Latin America by 2014, 48 million of them in 2014 alone. Pyramid's
forecasts could even be tagged as conservative, given the degree of
innovation and speed by which smartphone wholesale prices are decreasing. Several
consultants and firms surveyed by TeleSemana.com estimate we are not far from seeing
smartphones with production costs below US$150, thanks to the introduction of open
platforms like Android. In fact, Google’s Nexus One is already around that price point,
according to iSuppli, which estimated its bill of materials to be US$170.

Accommodating Smartphones
But are the 3G network operators ready to accommodate a majority of smartphones? The
answer is a resounding no. If today operators were to have 50 percent of their user base
with smartphones, their 3G networks would collapse—remember that Infonetics
Research estimates that this scenario will happen in 2014 for many operators. Not only
are networks not ready in terms of capacity but neither is the customer care infrastructure
needed to address issues with users walking around with expensive multitasking devices.

Going back to the operators, AT&T has been reporting congestion issues in its 3G
network due to the iPhone, which the operator introduced only two and a half years ago.
The operator is offering free access to its Wi-Fi hotspots to users with Wi-Fi-enabled
devices in an attempt to reduce the congestion on its 3G network in some of its major
markets. Unlike other operators that sell the iPhone, AT&T does not allow the iPhone's
"tethering" function, which turns the handset into a modem that shares its data
connection with other devices.

On January 14, Vodafone joined the group of operators that are selling the iPhone. The
operator announced in September of last year that the device was going to be in its stores
at the beginning of 2010. It also announced that its network would be prepared to
accommodate the expected increase in data traffic. Last week, seven days after its
introduction, the operator announced that it had sold 100,000 units. Can you imagine the
sudden increase in data traffic Vodafone’s 3G network is being forced to sustain
overnight? And what about its call center?

Some consultancies, such as Strand Consult, and companies like InnoPath Software have
recently warned operators that they could be losing money with their current smartphone
strategy, claiming operators' business structures and networks are not yet ready for a
surge in data traffic from more complex devices.

Experiences like AT&T's show that there is no need to question the smartphone business
in the long term. It seems fair to assume, however, that operators have a lot of work to do
over the next three to four years, taking a proactive approach and implementing a number
of key measures to avoid unnecessary headaches.

Undoubtedly, one of the obvious mechanisms for preventing dissatisfied users is a massive
investment in next-generation backhaul networks. However, that option is somewhat
unrealistic, given the financial constraints operators face in performing such a
maneuver. Rather, operators should look to their existing resources, such as their
BSS/OSS processes, to maximize their service offering.

Users Demand More


Not all users are equal, and personalization is becoming an important lever for maximizing
3G network resources. Solutions that let operators classify data traffic by type of
user, based on consumption and location in the network, will help move resources to
where they are needed.
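
As a hypothetical illustration of that kind of classification, the sketch below buckets subscribers by monthly data consumption and counts heavy users per serving cell; the thresholds and field names are assumptions made for this example, not any particular vendor's solution.

```python
from collections import defaultdict

def usage_tier(monthly_mb):
    """Bucket a subscriber by monthly data volume (thresholds are assumed)."""
    if monthly_mb >= 2000:
        return "heavy"
    if monthly_mb >= 500:
        return "medium"
    return "light"

def heavy_users_by_cell(subscribers):
    """subscribers: iterable of dicts with 'cell_id' and 'monthly_mb' keys.
    Returns a count of heavy users per serving cell, i.e. where capacity
    would be most needed."""
    counts = defaultdict(int)
    for sub in subscribers:
        if usage_tier(sub["monthly_mb"]) == "heavy":
            counts[sub["cell_id"]] += 1
    return dict(counts)

sample = [
    {"cell_id": "cell-17", "monthly_mb": 3500},
    {"cell_id": "cell-17", "monthly_mb": 250},
    {"cell_id": "cell-42", "monthly_mb": 900},
]
print(heavy_users_by_cell(sample))  # {'cell-17': 1}
```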

On the other hand, Web 2.0 is a concept that cannot be ignored; users are demanding
transparency from the operators and want to be an active part of things like service
development. Operators need to implement mechanisms to communicate with their users
to be able to react faster to their needs and create a strategy that goes beyond having well
trained call centers and retail outlets.

Web 2.0 provides opportunities to create an environment much closer to the end users, to
quickly understand where services fail or what new services are of interest to an
operator’s subscribers. For example, mobile operator O2 has launched an MVNO called
GiffGaff, whose customer service is carried out by its own users in web forums. Thus, the
user gets a lower mobile service price, since the operator does not have to incur the costs
associated with support service.

This shows that users are gaining, and demanding, more involvement with their service
providers. That situation, far from being dangerous, creates a huge opportunity for mobile
operators.
4G is Coming: Are You Ready?

By Martin Creaner, President and COO, TM Forum

Even as large parts of the world wait for 3G service and the handsets that go along with
it, 4G is no longer just waiting in the wings. We’ve been seeing real trials taking place,
and by the end of 2010 and into 2011 and beyond, we’ll see real working deployments of
the technology that promises to bring data rates in excess of 100Mbits/sec.

As carriers work feverishly to upgrade their infrastructure, device manufacturers speed
development of handsets to support the new networks and content providers work on
creating apps and other services to take advantage of this huge leap in network capacity,
4G (whether you’re talking about LTE or WiMAX) is not the slam dunk we all assume it
is.

For example, even if the technical specifications say that 4G supports speeds in the range
of 50-100 Mbit/s, the reality is that the bandwidth available to any particular user will
be significantly less than this. Furthermore, the backhaul required to bring all of that data
back into the network with high quality of service is just mind-boggling. Carriers simply
aren’t equipped or prepared to make that kind of backhaul investment.
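
A quick back-of-the-envelope calculation shows why. Every figure below is an assumption chosen only to illustrate the gap between headline speeds, per-user rates and the backhaul a site would need.

```python
# Back-of-the-envelope numbers only; all values are illustrative assumptions.
SECTOR_PEAK_MBPS = 100        # headline air-interface capacity per sector
SECTORS_PER_SITE = 3
ACTIVE_USERS_PER_SECTOR = 20  # simultaneously active users (assumed)

# Shared capacity: each active user sees only a fraction of the headline rate.
per_user_mbps = SECTOR_PEAK_MBPS / ACTIVE_USERS_PER_SECTOR

# To carry the sectors at peak, the site's backhaul must aggregate them all.
site_backhaul_mbps = SECTOR_PEAK_MBPS * SECTORS_PER_SITE

print(f"average rate per active user: {per_user_mbps:.0f} Mbit/s")   # ~5
print(f"backhaul needed per site:     {site_backhaul_mbps} Mbit/s")  # 300
```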

Standard Operating Procedure


Another sticking point for 4G rollouts is the lack of standards. One aspect of standards
that’s near and dear to our hearts at TM Forum is how network equipment communicates
up to the OSS or how the OSS talks down to the network equipment. That translates to
southbound interfaces from the OSS and northbound interfaces from an element manager.
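
As a toy illustration of that split (this is not MTOSI or OSS/J, and the class and method names are purely hypothetical), an element manager might expose a northbound face that the OSS reads from, while the OSS drives the equipment through southbound calls on the same manager:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Alarm:
    resource: str   # e.g. an eNodeB or cell identifier
    severity: str   # "critical", "major", "minor", ...
    text: str

class ElementManager:
    """Hypothetical element manager showing the shape of the contract only."""

    def __init__(self):
        self._alarms: List[Alarm] = []

    # -- northbound: element manager -> OSS ---------------------------
    def get_active_alarms(self) -> List[Alarm]:
        return list(self._alarms)

    # -- southbound: OSS -> element manager -> network equipment ------
    def provision_cell(self, cell_id: str, profile: str) -> None:
        # a real manager would push this configuration to the element
        print(f"configuring {cell_id} with profile {profile}")

em = ElementManager()
em.provision_cell("eNB-001/cell-1", "LTE-20MHz-default")
print(em.get_active_alarms())  # -> []
```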

Our Interface Program has developed comprehensive interfaces of this type for many
years and for many technologies, including the TM Forum Integration Framework
interfaces such as MTOSI and OSS/J. We also define the underlying Information
Framework (SID), which underpins the creation of all these interfaces. And these
interfaces are now being adapted to suit the needs of 4G management. But as they say,
timing is everything. If we want to see widespread adoption of our interfaces in the
rollout of 4G, we need to ensure that they are ready when the hardware and software
vendors are creating their products. Vendors tend to be very happy to incorporate
available standards during the design phase of a product. But once the product has been
deployed, it is very difficult to get them to retrofit standards. So making sure that the 4G
interface standards are fit for purpose prior to widespread rollout of products is key. And
that probably means getting them in place during the next 6-9 months.

In practice, that means very quickly upgrading our key interface standards for areas like
network management, fault management, performance management and inventory
management.

I think we can make this deadline because we’ve essentially got the standards – or at least
the bones of the standards – already on the shelf, and what we need to do is make sure
everything is aligned properly.

What helps is that operators are sending us a very clear message that they no longer want
to invest huge amounts of intellectual capital in these interfaces; they just want interfaces
that work. Meanwhile, they plan to focus their increasingly scarce resources on areas that
provide true differentiation.

In fact, we’ve received detailed sets of requirements from operators like Deutsche
Telekom and Vodafone for 4G management, and we’re sharing those with companies
like AT&T and Verizon to flesh them out and rapidly distill them down to a relatively
small number of important requirements for a 4G world. And of course there will not be a
4G world, but more likely a hybrid world where network operators are simultaneously
running 2G, 2.5G, 3G and 4G networks in parallel. And this provides another strong
argument for standards. Adopting the same interface standards for multiple generations of
network equipment and technologies reduces the complexity that operations staff need to
handle and cuts upfront integration costs.

You Can’t Stop Progress


4G is coming; there’s no doubt about that. Already Sprint is touting its 4G network in the
U.S., and others will be making the same claim before too long. But unless we have the
right standards in place, enough backhaul capacity and of course applications and
services that will fully utilize this high-speed mobile broadband technology, 4G could
end up frustrating us all.
