Discussion in 'GENERAL Wireless Discussion' started by Gabriel, Jul 31, 2004.
AT&T exec: WiMax is the 4G wireless technology
New contracts point to the trend, says CTO Hossein Eslambolchi
By Johan Bostrom, IDG News Service
March 14, 2005
Two corporate customers in New Jersey will have broadband services delivered by AT&T using WiMax technology on a commercial trial basis beginning the first week of May, with plans for full deployment in 2006.
These are trend-setting contracts, according to Hossein Eslambolchi, AT&T's chief technology officer and chief information officer.
"WiMax will take over the 3G networks and become the 4G wireless technology," he said.
A "fairly large retail corporation with distribution across the U.S." and another corporation in Middletown, New Jersey, will be AT&T's first commercial trials for the emerging last-mile wireless broadband access technology.
"We have been testing voice over Internet Protocol services, video, instant messaging and gaming over IP for these customers," Eslambolchi said, not disclosing the companies.
WiMax is a popular term for the Institute of Electrical and Electronics Engineers' (IEEE) 802.16 standard, which uses the 700MHz to 66GHz frequency band to deliver 2M bps (bits per second) to 6M bps to each customer within a cell radius of up to 2 miles (3.2 kilometers) when the line of sight is interrupted. If the line of sight is uninterrupted, the speed can be higher and the radius larger.
The technology is often described both as a competitor and a complement to 3G (third-generation) and wired broadband but Eslambolchi, responsible for AT&T's strategic technology direction, considers WiMax superior to 3G -- and in many ways a substitute for wired broadband.
WiMax combines the benefits that other wireless networking technologies offer individually, addressing the needs of office, home and mobile users, he said.
In addition to the advantages in converging mobility, portability and fixed Internet access, the cost factor is also important to Eslambolchi.
"We pay $8.5 billion per year to local exchange carriers to lease capacity to business customers and consumers," said Eslambolchi. "With only 7,000 of the 270,000 business buildings in the U.S. wired by us, local access is crucial -- a battleground. WiMax becomes the niche play which can lower our cost."
Eslambolchi, who serves on the IEEE editorial board of the Journal of Network and Systems Management, notes that there is no world standard for wireless broadband technology except the wireless local area network standard Wi-Fi.
"I personally think that WiMax will end up (being) a worldwide standard," he said.
The live customer trial in Middletown will be carried out using Intel Corp. hardware. Intel has already announced its participation in WiMax network deployments with carriers in Latin America, China and the U.S., and is developing a wireless broadband chip for WiMax products.
AT&T will continue its live U.S. customer trials at two other locations in the fourth quarter of this year, according to Eslambolchi.
The experimental licenses from the U.S. Federal Communications Commission for the AT&T trials cover 2GHz to 3GHz.
AT&T launched its first commercial 3G service in the U.S. last year.
You gotta love DoCoMo. Not only do they deliver some of the sexiest cell phones on the planet, but now they've achieved 100Mbps download speeds in lab tests. Moreover, these speeds were reached both when the receiver was stationary and when it was moving at up to 30km/h. I'm not going to bore you with the technical layout that accomplished this feat; let's talk about what it will mean in everyday terms.
Downloading a 30 minute television program--of course this would be a lawful download--using your home's ADSL or cable connection would take you about 36 minutes (a 200MB file at 768 kilobits per second). That's just enough time to make some tea and get comfortable to watch your program.
Using DoCoMo's 4G wireless network, the same program would be downloaded onto your phone before you even had time to get the kettle on the stove: 16 seconds.
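The arithmetic behind those two figures is simple to check. A minimal sketch, where the file size and link rates are taken from the article and the helper function is purely illustrative:

```python
def download_seconds(file_megabytes, link_megabits_per_sec):
    """Transfer time for a file, ignoring protocol overhead."""
    file_megabits = file_megabytes * 8  # 1 byte = 8 bits
    return file_megabits / link_megabits_per_sec

# The article's 30-minute program: roughly a 200MB file
adsl_minutes = download_seconds(200, 0.768) / 60  # 768kb/s ADSL/cable
fourg_seconds = download_seconds(200, 100.0)      # DoCoMo's 100Mb/s lab rate

print(round(adsl_minutes))  # roughly half an hour, as the article says
print(fourg_seconds)        # 16.0 seconds
```

The slight spread around the article's "36 minutes" comes from whether megabytes are counted as 1,000,000 or 1,048,576 bytes; either way, the 4G figure of 16 seconds falls straight out of the division.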
This blisteringly fast network won't be commercialized anytime soon.
RF Subsystems for 3G and 4G Wideband Base Stations
13th September, 2004
US : Skyworks Solutions, Inc. (Nasdaq:SWKS), the industry's leading wireless semiconductor company focused on radio frequency (RF) and complete cellular system solutions for mobile communications applications, today announced the availability of the industry's most comprehensive RF subsystems for next generation cellular infrastructure equipment and other wireless transceiver applications. These new solutions -- the first of several wireless conversion transceiver product platforms Skyworks plans to introduce this year -- leverage innovative RF integrated circuit designs that maximize the performance, reliability, cost-efficiency and design simplicity of third and fourth generation (3G and 4G) base station transceivers.
"Skyworks is delivering to base stations and other broadband wireless infrastructure customers proven and advanced technologies honed in the handset market," said Sean Martin, senior director of Infrastructure and Wireless Data at Skyworks. "By providing integrated RF subsystem solutions, we are leveraging our leadership position with technologies such as direct conversion to help base station designers meet the stringent demands of 3G and 4G networks."
With the introduction of its latest DCR(TM) component, the SKY73010 -- a single chip direct quadrature modulator, Skyworks now offers the industry's most extensive direct conversion base transceiver station (BTS) RF subsystem solution. The new modulator complements Skyworks' other industry-leading DCR(TM) BTS building blocks like the direct quadrature demodulator and direct conversion mixer, which are the first products to meet the high linearity requirements of CDMA, WCDMA, GSM, EDGE, TETRA, and 3G base stations. When the new modulator is coupled with the company's other best-in-class products, Skyworks is able to offer DCR(TM) infrastructure subsystem solutions that reduce board size and component count, thereby speeding time-to-market and lowering bill of materials, two capabilities that have been identified by industry analysts as key drivers for recovery of the wireless base station market.
According to a recent IDC study, the base station semiconductor market is now posting healthy growth after several years of sluggishness that resulted from the slowdown in wireless infrastructure spending. IDC reported in June that the market is expected to reach $1.9 billion in 2004 and grow to $2.4 billion by 2008. "Strong OEM-backed standardization activity along with migration to off-the-shelf chip approaches will be the major trends to follow in this segment," said Sean Lavey, program manager at IDC. "We believe further cost reductions delivered at the chip level for key 3G transceiver and power amplifier subsystems will help jumpstart expansion and upgrades to data-enabled cellular networks."
The SKY73010 direct quadrature modulator accepts input frequencies from direct current (DC) to 250 MHz with a broad RF and local oscillator (LO) frequency range of 300 to 2500 MHz. It provides a superb broadband noise floor of -155 dBm/Hz, with 35 dBc carrier suppression and 45 dBc sideband suppression at an LO input power of 0 dBm. The SKY73010 is manufactured in a Silicon Germanium BiPolar Complementary Metal Oxide Semiconductor (SiGe BiCMOS) process and comes in a lead-free 16-pin, 4 x 4 mm, RF land grid array (RFLGA) surface mount package.
Other key building blocks in Skyworks' subsystem platforms include state-of-the-art ultra-linear power amplifier (PA) drivers, high-gain linear PA modules, dual fractional-N synthesizers, and diversity downconverters. Skyworks takes this subsystem approach to the next level by also offering custom module design capabilities, which combine RF/IF receive and transmit functions in single, surface mount, multi-chip modules (MCM) designed to meet specific customer requirements. This design flexibility, combined with in-house manufacturing and test capabilities, reduces time to market and the costs associated with ASIC development.
Skyworks' new infrastructure subsystem family is supported and complemented by an extensive portfolio of active and passive discrete components including high-performance switches, LNAs, pin diodes, attenuators, couplers, dielectric resonators and filters. When combined with Skyworks' subsystem solutions, these components offer the unique combination of high linearity, high P1dB, low noise and low cost, and can be used in applications ranging from base stations, satellite transceivers and wireless routers to wireless local loop, industrial/scientific/medical (ISM) band, telemetry, RFID and other global wireless applications.
Improved Bluetooth Protocol Stack for Mobile Phones
9th June, 2004
US: Broadcom Corporation announced the integration of its industry-leading Bluetooth protocol stack into its mobile phone software platform.
With the broadest range of Bluetooth application usage profiles available, the new software will enable manufacturers of EDGE / GPRS /GSM cellular phones based on Broadcom chips to easily add the short-range wireless technology to their handset designs.
As the most robust and field proven Bluetooth stack in the industry, the WIDCOMM software -- recently acquired by Broadcom -- will ensure broad interoperability between mobile phones and other Bluetooth devices. The new Broadcom(R) Bluetooth protocol stack features the industry's most extensive range of "usage profiles," software components that define and enable Bluetooth operation among various Bluetooth-enabled devices, such as PCs, printers, headsets, keyboards, PDAs, mobile phones and many others.
The Bluetooth software platform has been fully integrated with Broadcom's cellular protocol stack, drivers, and user interface phone software that ships with the BCM2132 EDGE and BCM2121 GPRS cellular baseband processors. Broadcom supplies cellular chips to Sony Ericsson, Ningbo Bird, and PalmONE and with adoption of Bluetooth in cell phones growing at a rapid pace, Broadcom's new software platform will facilitate the addition of Bluetooth into cellular products with very fast time-to-market and minimal development.
"Broadcom is driving the accelerated adoption of Bluetooth in cellular phones, having established ourselves as leaders in the space with our products targeted at the CDMA phone market several years ago," said Scott Bibaud, Director of Marketing for Broadcom's Bluetooth Products. "Our new integrated software platform extends our leadership into the markets for EDGE/GPRS/GSM phones, allowing handset manufacturers to add Bluetooth into their phones without significantly increasing their internal engineering capabilities or costs. Our integrated protocol stack and reference designs enable new features like music streaming and advanced camera functionality, bringing a much richer Bluetooth experience to the mobile phone."
With the new solution, Broadcom's customers and development partners need only minimal user interface software development to add Bluetooth to their products and complete their designs -- a significant advantage given the handset industry's short product lives and demanding development cycles.
Mobile phone manufacturers will also benefit from a wide array of products that facilitate fast time to market with entry-level and high-end Bluetooth-enabled phones. Broadcom customers can choose from various software and chip combinations and complete reference designs, enabling either development of modules or the direct placement of Bluetooth functionality on the phone printed circuit board.
Broadcom has consistently demonstrated leadership in the wireless industry as the first company to introduce a Bluetooth-certified single-chip radio, the first to market with a single-chip Wi-Fi(R) product, and the first to market with 4-slot EDGE cellular technology. Broadcom's prompt integration of the recently acquired protocol stack from WIDCOMM pairs the company's market-leading EDGE solution with its superior Bluetooth silicon, allowing mobile device manufacturers to more fully expose the benefits of EDGE's high data rates to end-users. Bluetooth-enabled PCs, notebook computers and PDAs can now easily connect to the high-speed "always on" connections provided by EDGE phones based on Broadcom baseband processors.
Industry Leading Bluetooth Software
Usage profiles are specialized applications that provide the vital information required by various peripheral and other devices to interact using Bluetooth. The greater the number of Bluetooth profiles that a handset contains, the greater its ability to communicate with a wider variety of other Bluetooth devices. Broadcom offers the most advanced Bluetooth profile portfolio, including audio profiles such as HSP/HFP for headset and hands-free applications; OPP, FTP and BIP for data and file sharing; audio/video for music streaming; CTP for cordless telephony applications, and many more. With multi-point and multi-profile functionality integrated with the highest performance Bluetooth radio in the industry, Broadcom provides the industry's leading Bluetooth solution for mobile phone handsets on the market today.
4G and the military:
U.S. NAVY AWARDS LOCKHEED MARTIN TEAM $2 BILLION CONTRACT TO BUILD MOBILE USER OBJECTIVE SYSTEM
SUNNYVALE, Calif., September 24, 2004 -- The U.S. Navy today announced that a team led by Lockheed Martin [NYSE: LMT] has won the competition to build the Mobile User Objective System (MUOS), a next-generation narrowband tactical satellite communications system that will provide significantly improved and assured communications for the mobile warfighter.
“Lockheed Martin is proud to be selected as the U.S. Navy's partner for this vital system," said G. Thomas Marsh, executive vice president, Lockheed Martin Space Systems. "Our innovative solution leverages the team's extensive communications satellite experience to help the DoD deliver net-centric capabilities for the U.S. military. We will now focus on building and integrating this innovative, capable, and flexible next-generation tactical communications system on schedule and on cost."
This win enables the team of Lockheed Martin Space Systems, Sunnyvale, Calif.; General Dynamics C4 Systems, Scottsdale, Ariz.; and Boeing Satellite Systems (BSS), El Segundo, Calif., to produce the first two satellites and associated ground control elements as part of the $2.1 billion Risk Reduction, Design Development, Acquisition and Operations Support contract awarded today by the U.S. Navy Space and Naval Warfare Systems Command (SPAWAR), on behalf of the Program Executive Office – Space Systems, San Diego, Calif. The contract also provides for options on three additional spacecraft. With all options exercised, the contract for up to five satellites has a total potential value of $3.26 billion.
MUOS will replace the current narrowband tactical satellite communications system known as the Ultra High Frequency Follow-On (UFO) system. With Lockheed Martin's design, MUOS satellites will be fully compatible with the existing UFO system and associated legacy terminals while dramatically increasing military communications availability and providing simultaneous voice, data and video in real time to mobile warfighters around the globe. MUOS will also maximize the full feature capability of the future Joint Tactical Radio Systems (JTRS) terminals. Anticipated launch date for the first MUOS satellite is planned for 2010.
Lockheed Martin is the prime contractor and systems integrator for the MUOS program, leveraging its 40-plus years of heritage design, development, and operational expertise in providing the most advanced satellite, communications and control systems for Department of Defense (DoD) and global commercial customers. MUOS will feature key technologies currently being fielded in the commercial telecommunications industry and will significantly increase capacity for the mobile warfighter over the next two decades. Lockheed Martin's award-winning A2100 bus will serve as the MUOS spacecraft platform.
General Dynamics C4 Systems will lead the user-entry and integrated ground segments of the MUOS program, supplying a secure ground network, satellite control and network management, and a JTRS-compliant terminal solution. BSS will provide a significant portion of the UHF payload capability. Harris Corporation will supply the MUOS spacecraft antenna. The team also includes Northrop Grumman.
“We are very excited to be given the opportunity to apply our longstanding UHF SATCOM heritage to the end users of MUOS,” said Ron Taylor, a vice president of General Dynamics C4 Systems. “We understand the needs of warfighters on the move and will leverage our experience in network systems to bring them leading edge features in the MUOS infrastructure.”
General Dynamics, headquartered in Falls Church, Virginia, employs approximately 69,400 people worldwide and anticipates 2004 revenues of $19 billion. The company is a market leader in mission-critical information systems and technologies; land and expeditionary combat systems, armaments and munitions; shipbuilding and marine systems; and business aviation. More information about the company can be found on the World Wide Web at www.generaldynamics.com.
“Throughout our 15-year partnership with the U.S. Navy to develop and deploy the UFO satellite fleet, we have witnessed great leadership from the Navy in providing vital global communications services to Armed Forces personnel worldwide,” said Dave Ryan, vice president and general manager of Boeing Satellite Systems. “Through this partnership with Lockheed Martin, we look forward to once again working with our Navy customer to deliver an integrated solution for the MUOS mission.”
Boeing Satellite Systems is the satellite-manufacturing arm of Boeing Integrated Defense Systems. A unit of The Boeing Company, Boeing Integrated Defense Systems is one of the world's largest space and defense businesses. Headquartered in St. Louis, Boeing Integrated Defense Systems is a $27 billion business. It provides network-centric system solutions to its global military, government, and commercial customers.
Headquartered in Bethesda, Md., Lockheed Martin employs about 130,000 people worldwide and is principally engaged in the research, design, development, manufacture and integration of advanced technology systems, products and services. The corporation reported 2004 sales of $35.5 billion.
By Betsy Harter
Jul 15, 2001 12:00 PM
Although carriers are reluctant to discuss 4G, vendors are mapping its future.
It's still a decade away, but 4G already is a big topic of discussion behind closed doors. But what is 4G, exactly, and why is it necessary to think about it today?
Most vendors already have a position on 4G as they prepare to push their visions in front of standards bodies and generate interest among carriers. But each manufacturer has a different definition of 4G. Whether it's spectrum optimization, network capacity or faster data rates, vendors already are dreaming up ways for carriers to spend money and spectrum decades down the road.
Carriers, however, are reluctant to discuss 4G, either because they refuse to take a public position on it when 3G roll-outs still are unfulfilled, or because they are in denial. But carriers soon will find that 4G is not going away.
Never Too Early
3G was supposed to be the land of wireless milk and honey for carriers, enabling multimedia, data transfer between wireless phones at lightning speeds and m-commerce. So why is 4G even necessary? Al Javed, Nortel Networks (www.nortel.com) wireless CTO, said that although it is true that 3G will bring transactional services, they will be lower-speed services compared to, say, streaming video.
“That is not to say you won't love these lower-speed transactional services like location-based services, wireless shopping, personal services or e-mail. But when we move to streaming video and audio, we need 4G systems,” he said.
But Hakan Eriksson, Ericsson (www.ericsson.com) vice president of research, said 3G will offer some streaming services. For instance, a consumer subscribing to a service that provides video of her favorite sports team scoring a goal, basket or touchdown would not require high-speed video transmission.
“That will take 15 to 20 seconds at a bit rate of 128kb/s,” he said. “When you talk about 4G, applications will require higher speeds and capacity.”
One such application might be a set of eyeglasses that projects information about a person's location so that only that person can see it. For instance, someone at a museum could see facts about a painting projected onto a wall, or a tourist could see road signs that locals would not see.
An anonymous spokesperson at Siemens (www.usa.siemens.com) — a company that calls future wireless systems “beyond 3G” because it is too difficult to clearly differentiate between 3G, 4G and xG features — said systems beyond 3G are important for several reasons. First, the ever-increasing demand for more powerful and seamless applications drives the need for higher data rates. Next, high-quality streaming video — an essential component of future multimedia-based services — will require higher data rates than 3G will provide.
“Also, providing seamless mobile access to services without regard to the physical network used, from the customer's point of view, is another essential requirement,” the source said. Moreover, standards typically take 10 years to move from conception to commercial products, so now is the time to start thinking about solutions beyond 3G if carriers want a powerful generation of seamless fixed and mobile services in 2010 to 2015.
Reinaldo Valenzuela, Bell Labs (www.bell-labs.com) director for wireless communications research, said that most standards start in Europe with pre-competitive cooperative projects funded by the European Commission under the scientific and technical umbrella of ETSI (www.etsi.org). European standards bodies already have begun the pre-competitive process for some aspects of 4G.
4G According to Vendors
No two vendors have the same 4G vision but all agree that post-3G systems will be a conglomeration of previous wireless technologies rather than a whole new wireless system.
“This time, the route to 4G will not be a system or a standard,” Eriksson said. “It will be a combination of technologies, building on 3G but capable of much higher speeds.”
4G is synonymous with speeds of 50Mb/s to 100Mb/s. Although wireless LANs are slated to reach these speeds, wireless LANs alone will not comprise 4G because their coverage is limited, he explained.
For Nortel Networks and Siemens, 4G is all about access.
“In terms of networking technologies, we are all moving toward IP, including wireless,” Javed said. “The networks will not change; 4G is a change in access technology.”
The source at Siemens said future wireless networks will be more than just a new radio interface; instead, one goal is to increase capacity to provide new multimedia services with higher data rates for mobile applications or, in some cases, for portability without wire-based plug-ins.
“Another important feature will be seamless mobile access to different networks: access anywhere, anytime,” the source said. A system “beyond 3G” comprises a combination of several optimized-access systems into a common IP-based medium access and core network platform. These different access systems will interwork by horizontal and vertical handover, service negotiation and global roaming to provide globally optimized seamless services to users.
“The borders between 3G and 4G are not that clear,” said the source. “The term 4G seems to skip the need for a constant development of services, applications and the network technology. This is why we think and develop ‘beyond 3G.’”
Soon, we will cease to report on 4G technology on this website. The choice will be up to each of you. The current wireless protocols are not what they should be. Your phone bills are not what they should be; your wireless service is not what it should be. We will only offer you a choice to make based on the FACTS! 4G technology is coming very soon, and it will not be controlled by the ones that offer the service we have all become used to. 4G is truly the next solution for the consumer at large.
There are three kinds of people in this world:
· Those that make things happen.
· Those who watch things happen.
· And those who wonder "what happened?"
We all agree that the wireless industry and the new economy are making things happen, but let's at least try to be among those watching.
3G means "third generation." The first generation in wireless was analog. The second generation is digital. The third generation in wireless will be a deliberate migration to faster, data-centric wireless networks. The U.S. is trying to get to 3G in three or four years, and meanwhile we are being introduced to 2.5G systems that allow cell phones to surf the web in a very limited way.
What's important to cities and counties is that 3G requires both new handsets and new equipment at personal wireless service facilities. Further, personal wireless service facility sites will need to be closer together to handle all this new data, so there will be an increase in deployment.
Kit Spring, a financial analyst for Morgan Stanley Dean Witter, says "3G networks may require up to three times the number of sites of existing 2G networks." Assuming there are 100,000 sites today, that would be an additional 300,000 sites, for a total of 400,000 sites.
Stephen Clark, CEO of SpectraSite, a tower builder and manager, estimates "600,000 new cell sites will be needed by year 2008."
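The arithmetic behind those projections is worth making explicit, since the inputs are analyst assumptions rather than measurements:

```python
existing_sites = 100_000  # the article's assumed current U.S. site count
multiplier = 3            # Spring: 3G may need up to 3x the sites of 2G

# The article counts the tripling as sites added on top of today's base
additional_sites = existing_sites * multiplier
total_sites = existing_sites + additional_sites

print(additional_sites)  # 300000
print(total_sites)       # 400000
```

Note that Clark's 600,000-site estimate is a separate forecast of new builds by 2008, not derived from the same multiplier.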
Notice that both gentlemen use the term "sites" rather than "towers." That's because the technology is getting smaller and smaller. 3G sites in Finland and Japan are the size of residential mailboxes and are affixed to utility poles.
But the real story is that handsets will not be getting that much smaller for 3G because they will have more software inside. In fact, as handsets morph into PDAs (Personal Digital Assistants), the appliance we carry around will become increasingly like a computer. So some of the software that normally goes into a cell site is being transplanted into the handset.
This brings us to "what is 4G," which is like predicting the average human life span in the year 2050. More and more futurists are thinking that the cell site will eventually reside in the handset. That's right: "towers" become as ubiquitous as handsets which will be on 24/7 (all the time).
One technology already in development by Mesh Networks has grown out of Department of Defense applications. In the military application, where "towers" cannot be assumed, each soldier's handset acts as the tower. The commercial application, called ArachNet, will still need base stations, but far fewer will be necessary than in today's 2G networks, and even fewer than in the proposed conventional 3G networks.
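To make the mesh idea concrete, here is a toy sketch of hop-by-hop relaying. This is not Mesh Networks' actual protocol -- the topology and names are invented -- but it illustrates handsets forwarding traffic until one of them can reach a base station:

```python
from collections import deque

def hops_to_base(links, start, bases):
    """Breadth-first search: fewest relay hops from `start`
    to any base station reachable through the handset mesh."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if node in bases:
            return hops
        for neighbor in links.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None  # mesh is partitioned: no route to any base

# Invented topology: handset A reaches the base only via B and C
links = {"A": ["B"], "B": ["A", "C"], "C": ["B", "BASE"], "BASE": []}
print(hops_to_base(links, "A", {"BASE"}))  # 3 hops
```

The practical payoff is exactly what the article describes: the more handsets that are on, the denser the relay graph, and the fewer fixed base stations are needed to keep every node within a few hops of one.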
Another concept, called VDMA ("Virtual Division Multiple Access"), is in development in San Francisco. The handsets could transmit as far as ten miles and don't have to be turned on to relay the signal. Each handset needs only a neighboring handset's battery, and off we go, hitchhiking around the continent as long as there's another handset within ten miles. The company seeking $5 million for further development is World Wide Wireless Communications Inc., based in, where else, San Francisco.
The next generation gap
By Jason Ankeny
Mar 14, 2005 12:00 AM
With 3G finally beginning to live up to its hype, it may seem a bit silly to even begin discussion of the U.S. wireless industry's evolution to 4G. After all, while the major distinction of 4G over 3G is the former's accelerated data transmission rates, many of the services promised by 4G — mobile multimedia and streaming video among them — are already available via 3G, albeit in more primitive formats, and consumers aren't exactly standing in line to sign up.
But as they say, you can't stop progress. According to Visant Strategies' recent study “The Road to 4G and NGN: Wireless IP Migration Paths,” the 4G market will begin taking off in the next 12 to 24 months, totaling 113 million global users by 2010. And while neither operators nor their subscribers may be ready and willing to make the leap to 4G, a clutch of U.S. mobile broadband vendors is waiting in the wings, fine-tuning their portfolios with deployments and trials both at home and abroad, most notably in an Asian wireless market well ahead of its western counterpart.
But there's a catch: Seemingly no one agrees on what 4G represents, when it's coming or whether such a thing even exists. Here are four vendors on the subject of 4G.
ArrayComm: “I've heard the term ‘4G’ used in a variety of contexts,” said Mark Goldburg, chief technical officer with personal wireless services developer ArrayComm. “There are all kinds of standards groups working toward some next generation of cellular technology, so it might be that. Alternatively, it might be a service vision. But the 4G service vision hasn't changed much from that of the original 3G service vision that got people so excited and created the expensive spectrum auctions and resulted in the allocation of more spectrum here in the U.S. and elsewhere.”
Founded in 1992 by Martin Cooper, inventor of the first portable cellular phone, ArrayComm is best known for developing iBurst, a broadband wireless system promising 1 Mb/s data rates. Powered by the company's IntelliCell smart antenna technology, which adds a unique spatial metric to each transmission that effectively reuses the same channel multiple times over, iBurst delivers 400 times greater capacity than current wide-area wireless systems.
ArrayComm's technology is presently deployed in more than 225,000 base stations, serving more than 15 million people across four continents. Goldburg stresses that despite its proprietary stake in the iBurst protocol, ArrayComm is in fact standards-agnostic, and its smart antenna technology will work over any network, be it 802.16e, 802.20 or 3G.
“People say, ‘You're in the iBurst business, and you're in the .16 business — aren't you competing with yourselves?’ But both solutions are targeted at different market segments,” he said.
In fact, ArrayComm's success or failure will hinge most of all on the adoption of smart antennas.
“Broad acceptance of smart antenna technology is what's necessary to differentiate WiMAX from W-CDMA,” Goldburg said. “WiMAX is much more amenable to smart antenna processing, and that's where people think they'll get big gains in economics and performance over 3G. The market has to recognize that smart antennas are necessary for cost-effective mobile broadband.”
Flarion Technologies: “There is no such thing as 4G — there's just what's beyond 3G,” said Ronny Haraldsvik, vice president of marketing and global communications for Flarion Technologies. “A move from circuit-switched to packet-switched is what this is all about — we're unwiring what already exists. You're going to see wireline Ethernet become wireless Ethernet. The business case is lower cost of network operations, improved capacity and speed, and access to new revenue sources.”
Flarion is synonymous with Flash-OFDM (orthogonal frequency division multiplexing), an IP-based mobile broadband standard that claims speeds of up to 1Mb/s. In February, the company released upgraded Flash-OFDM gear targeting triple-play IP services over mobile networks, debuting a new platform dubbed Flexband that promises a full 1 GB of data per subscriber each month.
“We have a system that can deliver one gigabyte of data consumption per user per month at a cost to the operator of $10 or less,” Haraldsvik said. “That means an operator can sell mobile broadband and IPTV and VoIP anywhere from $40 and up, and at less than two percent market penetration, you're breaking even with our network.”
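Haraldsvik's economics claim can be sanity-checked with simple arithmetic. The sketch below uses the $10 per-user cost and $40 retail price from the quote; the fixed network cost and market size are illustrative placeholders, since Flarion's actual figures aren't given.

```python
# Back-of-envelope check of the quoted break-even claim.
# Known from the quote: $10/user/month delivery cost, $40/month retail price.
# ASSUMPTION: the fixed monthly network cost and market size are illustrative.

def breakeven_penetration(price, cost_per_user, fixed_cost_monthly, market_size):
    """Fraction of the addressable market needed to cover fixed costs."""
    margin = price - cost_per_user              # $30/user/month gross margin
    users_needed = fixed_cost_monthly / margin  # subscribers to break even
    return users_needed / market_size

# e.g. a metro network costing $300,000/month serving a 500,000-person market
pen = breakeven_penetration(price=40, cost_per_user=10,
                            fixed_cost_monthly=300_000, market_size=500_000)
print(f"break-even penetration: {pen:.1%}")  # break-even penetration: 2.0%
```

Under these assumed fixed costs, the quoted "less than two percent penetration" break-even is at least internally consistent with a $30 monthly margin per subscriber.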
Although Flarion's fortunes suffered a blow last month when Nextel announced it would end its Flash-OFDM trial in Raleigh, N.C., this summer, OFDM may still have the inside track on market dominance. According to “Hard Numbers and Experts' Insights on Migration to 4G Wireless Technology,” a recent report issued by Rysavy Research and Datacomm Research Co., the advantages of OFDM become more pronounced as networks achieve higher speeds. The study adds, however, that because CDMA networks — whether CDMA2000 or wideband CDMA — still have some distance to evolve, it will be several years before vendors must shift to OFDM.
It is well known that a handful of telecom companies dominate the industry and directly control most of the infrastructure in use worldwide, with few exceptions. Some countries have state-regulated or state-owned telecom services that control their domestic markets, preventing outside providers from competing fairly for a reasonable share of users. Limited disposable income also prevents many potential consumers from benefiting from the boom in wireless communications, and more broadly the entire broadband arena, including global Internet access.
With advances in the technologies each industry uses to deliver its services, and with ongoing change in the regulatory framework, each is now able to provide the others' core services, offering consumers a full range of services on a competitive basis.
Many telecom companies are competing for frequency allocations, and it is increasingly difficult to plan future equipment installations and network upgrades because of the rising cost of maintaining and upgrading equipment that was never designed to be upgraded. The result is excessive patchwork and workarounds to salvage and reuse existing communications gear. This is true of microwave and other actively used equipment that must process and route incoming and outgoing signals of various frequencies around the clock, and of the cable and fiber lines in constant use within the network. Add to this the various international regulatory agencies and their agreements, license rights, territory rights, and other obstacles, seen and unseen, that ultimately burden the cost and maintenance of the network, whether local, regional, or global. Whether the medium is cable, satellite, wireless, or all three, tremendous costs are involved because planning at the inception stage some 30 years ago failed to recognize the dynamics of engineering obsolescence. The end result is price increases driven by costly upgrades, fees, and other unforeseen expenses, and the one who bears the burden is the ratepayer who uses these services for private or commercial purposes. Federal and state regulators add their own fees for service within their jurisdictions. All of this could be avoided with careful planning and execution of the technology at hand.
Various reports available on the Internet or through industry organizations provide data on the worldwide telecom industry, including service providers and suppliers. Separate charts display aggressive forecasts of profits and losses, each keyed to the technology offered, services offered, customer penetration or market share, new customers, and consolidation of competition, in which the dominant aggressor drives a competitor either to sell out and be bought, or to be dismantled and sold off to pay debt while its customer base is absorbed.
The bottom line is that billions of dollars of revenue are available to the telecom companies that can provide fast, reliable, user-friendly service, with as many features as possible, at a price that attracts the ratepayer to take one or all of them, without the rising prices and costs that eventually drive customers to switch services, produce bad debt in collections, sour public opinion, and invite stringent regulation that restricts profit margins.
The consumer is the driving force behind all markets and profits, no matter the product or service. A measurable amount of revenue comes from the exchange of goods, technology, and services between companies and other users, but in the end it is the consumer who makes or breaks a company. New technologies have always been viewed with skepticism and uncertainty, because of unforeseen obstacles or a lack of confidence on the part of the supplier. A company with a product must convince not only financial underwriters but legislators and regulators at every level that the product or service is viable, has technical merit, and can promise a fair price, a return on investment, and steady growth in profits.
The current telecom market is saturated with regulation, plateauing technology, consumer confusion and dissatisfaction, low returns, and high costs passed on to the ratepayer: surcharge taxes from all sides, increased competition, frequency bands controlled by federal agencies with no new relief, rising costs of ownership, consolidation and takeovers, thinned-out networks, and piecemeal technology fixes that add more equipment to the field while existing fiber rings go unused, outdated copper plant fails, and the costs of satellites and launch capabilities keep rising. Most services offered are only slightly improved in performance, and not without a high price. Current wireless systems are overstated and require constant retrofits. The speed and reliability of many services fall short of what consumers demand, or were even promised, by an industry that hides behind numbers, poorly designed technology, and development dates pushed out by many years, sending mixed messages to investors and consumers who rely on honest assessments of the technology at hand.
48. Magnetic reflection. R. B. ABBOTT, Purdue University.--Experiments suggested by Ross (Proc. A.I.E.E., June, 1920) on "Magnetic Reflection" were carried out on a larger scale. A strong magnetic field was produced by a 500-cycle current passing through a large electromagnet. A large plane coil of wire, connected to a resistance-coupled amplifier with a telephone receiver, was placed along the axis of the magnet and at some distance from it. The coil was oriented in the magnetic field so that no currents were induced in it. A plane thin sheet of zinc about four square feet in area was then placed at some point in the field and rotated around a vertical axis. As the plate was rotated, a 500-cycle note was produced in the telephone and went through a cycle of intensities corresponding to the angles of the zinc plate. The results show that the zinc plate, due to eddy currents, acted very much like a mirror, the angles of incidence and reflection being about equal. These results were obtained when the zinc plate was twenty feet and more from the magnet and detecting coil.
[Abbott, R. B. Physical Review, February, 1928, 313]
There has been much speculation as to the validity of Gaiacomm's claim of having developed a 4G global wireless communications technology based on terahertz frequency and low frequency combinations.
We have discovered uses for the Terahertz band in the artful form of wireless communications.
Scientists around the world have been actively researching and developing equipment for the use of terahertz waves and infrasound. A form of signal transduction, though not at the cellular level, has been rediscovered and developed.
We have discovered a way to use the dynamics of the earth's magnetic field and the surface area of the planet to rebroadcast a signal globally and with little or no loss.
We have discovered a way to use sonar to detect and identify other objects in the oceans without intruding on the communications channels of whales and other mammals that inhabit the deep oceans.
We have discovered a method to communicate to submarines at any depth or location.
We have discovered a method to use the dormant fiber optic rings that exist worldwide to utilize our wireless broadcast system.
We have discovered a method to isolate and manipulate the ionosphere in such a way as to control the dynamics of the electrons that exist and cause isolated fusion reactions within selected regions of the earth's atmosphere, (Compton Effect).
We have discovered a method to eliminate the dependency on satellites that are too costly to maintain and "clutter" the skies with space junk.
We have discovered a method to effectively broadcast a signal to any location on planet earth to digital devices (cell phones, handheld PDA's), computers, and other frequency specific devices. A wireless network that is superior to GSM, WiFi or any other wireless protocol.
We have discovered a method to construct an isotropic antenna, thus allowing for a 360-degree signal footprint. The projected radiated footprint is 5 million square miles of surface area per antenna.
We have discovered a method to use the earth as a "transponder" and take advantage of the spherical wave-guide that exists globally.
We have discovered a method to keep the cost of operation down, passing on savings to the ratepayer far beyond what any telecommunications provider worldwide can offer. In addition, we have designed a pricing structure that will allow global customers to participate in our service and have money to spare.
This new wireless 4G network will only add choices for the consumer and will not directly challenge the other telecom networks. It's about choices, not competition.
I ask for your patience on this new technology and ask that you all keep an open mind.
Dr. Judah Ben-Hur
I am finished. I hope you all can learn to think outside the box and accept change...because change is coming to the telecommunication industry one way or another! Good luck to you all!
How it all started:
Nicholas Constantine Christofilos (Νικόλαος Χριστοφίλου) (December 16, 1916 – September 24, 1972) was a Greek-American physicist. Like Nikola Tesla, he was a remarkable personality. He was born in Boston and raised in Greece, and was working for an Athens elevator company when he became interested in high-energy particle physics. He later worked on large-scale projects, mainly for military purposes. His strong-focusing principle, later found independently by others, reduced the size and cost of the accelerators needed to achieve beams of a given energy; the more energetic beams it made possible increased our knowledge of the fundamental constituents of the world. Other ideas of Christofilos that were realized include antennas of almost continental dimensions, driven by millions of watts, producing extremely low frequency waves for submarine communication, and artificial Van Allen-like radiation belts formed by exploding hydrogen bombs in the upper atmosphere, which can also produce short, intense electromagnetic pulses able to destroy all electronic devices over a very large area. Such radiation could destroy Soviet satellites in orbit and disturb most military communication carried over the HF and VHF radio bands.
A Greek newspaper claimed that the self-taught nuclear scientist Christofilos had found a method to transform matter into energy. Life magazine called Christofilos "The Crazy Greek" for his ideas. (William Trombley, "Triumph in Space for a 'Crazy Greek'", Life (March 30, 1959), pp. 31-34.)
Bioseismic studies have previously documented the use of seismic stimuli as a method of communication in arthropods and small mammals. Seismic signals are used to communicate intraspecifically in many capacities such as mate finding, spacing, warning, resource assessing, and in group cohesion. Seismic signals are also used in interspecific mutualism and as a deterrent to predators. Although bioseismics is a significant mode of communication that is well documented for relatively small vertebrates, the potential for seismic communication has been all but ignored in large mammals. In this paper, we describe two modes of producing seismic waves with the potential for long distance transmission: 1) locomotion by animals causing percussion on the ground and 2) acoustic, seismic-evoking sounds that couple with the ground. We present recordings of several mammals, including lions, rhinoceroses, and elephants, showing that they generate similar acoustic and seismic vibrations. These large animals that produce high amplitude vocalizations are the most likely to produce seismic vibrations that propagate long distances. The elephant seems to be the most likely candidate to engage in long distance seismic communication due to its size and its high amplitude, low frequency, relatively monotonic vocalizations that propagate in the ground and have the potential to travel long distances. We review particular anatomical features of the elephant that would facilitate the detection of seismic waves. We also assess low frequency sounds in the environment such as thunder and the likelihood of seismic transmission. In addition, we present the potential role of seismic stimuli in human communication as well as the impact of modern anthropogenic effects on the seismic environment.
Processor to be used for 4G system:
STI cell processor
Next generation processors
Just as the cells in a body unite to form complete physical systems, a "Cell" architecture will allow all kinds of electronic devices (from consumer products to supercomputers) to work together, signaling a new era in Internet entertainment, communications and collaboration.
Breakthrough microprocessor architecture that puts broadband communications right on the chip.
Markets: · Next-generation communications
· Consumer multimedia applications
STI cell processor defined
Two years ago, Sony, Toshiba and IBM (STI) announced that they had teamed up to design an architecture for what is termed a system-on-a-chip (SoC) design. Code-named Cell, chips based on the architecture will be able to use ultra-high-speed broadband connectivity to interoperate with one another as one complete system, similar to the way neural cells interoperate over the brain's network.
Market demand for STI cell processor
IBM expects Cell to define an entirely new way of operating. Cell's underlying architecture will enable it to manifest itself into many forms for many purposes, helping to open up a whole new set of applications. Incorporating this architecture, chips will be developed for everything from handheld devices to mainframe computers.
IBM strategy with STI cell processor
IBM has an unmatched history and capability of building custom chips and believes the one-size-fits-all model of the PC does not apply in the embedded space; embedded applications will require a flexible architecture, like Cell. Cell also brings together, for the first time, many leading-edge IBM chip technologies and circuit designs developed for its servers.
STI cell processor benefits
Cell will take advantage of IBM's most advanced semiconductor development and process technologies. These chips will deliver high performance while consuming little power.
The same processor that is being put into the new PS3.
Hmmm, massive clusters of PlayStations running the new 4G networks.
On a side note, Sony is claiming 2 TFLOPS of power in one PS3. That means just 35 of them in a cluster would rival IBM's BlueGene, the world's most powerful supercomputer, for under $16,000.
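Taking the thread's figures at face value, the cluster arithmetic works out as follows. The per-unit price here is an assumed placeholder chosen to stay under the quoted $16,000 total, not a Sony figure.

```python
# Claimed figures from the thread: ~2 TFLOPS per PS3, 35 units per cluster.
# ASSUMPTION: the $450 per-unit price is illustrative only.
ps3_tflops = 2.0
cluster_size = 35
price_per_unit = 450

total_tflops = ps3_tflops * cluster_size   # aggregate peak throughput
total_cost = cluster_size * price_per_unit # total hardware outlay

print(f"{total_tflops:.0f} TFLOPS for ${total_cost:,}")  # 70 TFLOPS for $15,750
```

Of course, peak TFLOPS from a marketing claim is not the same as sustained LINPACK performance, which is how BlueGene is ranked.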
Good points, but we will glean from the Cell technology and use it to process the billions of signals that will be bouncing around in some order.
In order to understand the governing dynamics of this technology, begin to understand signal transduction and infrasonics and glean from that. It's all in Nature, not in our heads; we must be able to transform from Nature to physics to technology and back...
Inform and educate to break the bounds of ignorance!
The ionosphere is a layer of ionized gas surrounding the Earth at heights from roughly 60 to 1,000 km. The ionization is caused by solar ultraviolet and particle radiation and by meteors. Since the main source of ionization is the Sun, the state of the ionosphere depends primarily on solar activity. Ionospheric layers are characterized by the number of free electrons per unit volume. On the basis of local maxima of electron density, the D, E and F layers are distinguished. In the daytime the F layer splits into the F1 and F2 layers, while at night the D layer largely disappears and the remaining ionization is concentrated in the E and F layers.
Since the refractive indices of these ionospheric layers differ, radio waves reflect from the layers. Each layer has a maximum frequency, called the critical frequency: waves at frequencies above it will reflect with a probability of less than 50 percent at vertical incidence. If the waves reach the layer obliquely, however, signals with frequencies higher than the critical frequency fc can still be reflected. The relation between the maximum usable frequency (MUF) and the critical frequency is then determined by the angle of incidence y as MUF = fc/cos y.
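The secant-law relation above is straightforward to compute. A minimal sketch, where the 9 MHz critical frequency and 70-degree incidence angle are illustrative values, not figures from the text:

```python
import math

# MUF = fc / cos(y): the maximum usable frequency for oblique-incidence
# reflection, given a layer's critical frequency fc and the angle of
# incidence y measured from the vertical.
def max_usable_frequency(critical_freq_mhz, incidence_deg):
    return critical_freq_mhz / math.cos(math.radians(incidence_deg))

# Example: an F2 layer with fc = 9 MHz, signal arriving 70 degrees off vertical
muf = max_usable_frequency(9.0, 70.0)
print(f"{muf:.1f} MHz")  # 26.3 MHz
```

At vertical incidence (y = 0) the MUF equals the critical frequency itself, and it grows rapidly as the path becomes more oblique.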
Search for data: http://www.scicentral.com/
Invention and Discovery: Atomic Bombs and Fission
Last changed April 1997
Leo Szilard and the Invention of the Atomic Bomb
It would be logical to assume that the discovery of fission preceded the invention of the atomic bomb. It would be normal also to expect that no single individual could really claim to be "the inventor", since the possibility sprang naturally from a physical process, and required the efforts of many thousands to bring it into existence. Many descriptions of the origin of atomic bombs can be found that logically and normally say exactly these things.
But they are not correct.
The idea of "invention" does not usually require the physical realization of the invented thing. This fact is clearly recognized by patent law, which does not require a working model in order to award a patent. It is common for inventions to require additional discoveries and developments before the actual thing can be made. In these cases, an invention may fairly have more than one inventor - the originator of the principal idea, and the individual who actually made the first workable model.
In the case of the atomic bomb there is clearly one man who is the originator of the idea. He was also the instigator of the project that led ultimately to the successful construction of the atomic bomb, and was a principal investigator in the early R&D both before and after the founding of the atomic bomb project - making a number of the key discoveries himself. By any normal standard this man is the inventor of the atomic bomb.
This man is Leo Szilard.
On September 12, 1932, within seven months of the discovery of the neutron, and more than six years before the discovery of fission, Leo Szilard conceived of the possibility of a controlled release of atomic power through a multiplying neutron chain reaction, and also realized that if such a reaction could be found, then a bomb could be built using it.
On July 4, 1934 Leo Szilard filed a patent application for the atomic bomb. In his application, Szilard described not only the basic concept of using neutron-induced chain reactions to create explosions, but also the key concept of the critical mass. The patent was awarded to him - making Leo Szilard the legally recognized inventor of the atomic bomb.
Szilard did not patent this prescient and tremendously important idea for personal gain. His motive was to protect the idea to prevent its harmful use, for he immediately attempted to turn the idea over to the British government for free so that it could be classified and protected under British secrecy laws.
On October 8, 1935 the British War Office rejected Szilard's offer, but a few months later, in February 1936, he succeeded in getting the British Admiralty to accept the gift. Szilard's attempt to restrict the availability of the atomic bomb is also the earliest case of nuclear arms control. Later, when the possibility of a German atomic bomb had been shown to be nonexistent, Szilard campaigned vigorously against the use of the bomb, and went on to help found the Bulletin of the Atomic Scientists and the Council for a Livable World.
The Discovery of Fission
With the discovery of the neutron by James Chadwick in February 1932 a scientific gold rush ensued to discover what effects would be produced by bombarding different materials with this new particle. Over the next several years, teams of researchers in several countries (especially one headed by Enrico Fermi in Rome) bombarded every known element with neutrons and recorded scores, even hundreds, of new radioactive isotopes.
On May 10, 1934 Fermi's research group published a report on experiments with neutron bombardment of uranium, the first such investigation to be reported. Several radioactive products were detected, but positive identifications were not made. Interpreting the results of neutron bombardment of uranium became known as the "Uranium Problem", since the large number of different radioactivities produced defied rational explanation. The dominant theory was that a number of transuranic elements never before seen were being produced, but the chemical behavior as well as the nuclear behavior of these substances were unexpected and confusing.
The first statement of the correct resolution of the Uranium Problem was published by German chemist Ida Noddack in September 1934. Her letter in _Zeitschrift fur Angewandte Chemie_ argued that the anomalous radioactivities produced by neutron bombardment of uranium might be due to the atom splitting into smaller pieces. No notice was taken of this suggestion.
Fermi discovered the extremely important principle of neutron behavior called "moderation" on October 22, 1934. Moderation is the phenomenon of enhanced capture of low energy neutrons, as when they are slowed down by repeated collisions with light atoms.
December 1935 - Chadwick wins the Nobel Prize for the discovery of the neutron.
In November-December 1938, Otto Hahn and Lise Meitner correctly unraveled the Uranium Problem. Hahn determined conclusively that one of the mysterious radioactivities was a previously known isotope of barium; working with Meitner, he developed a theoretical interpretation of this demonstrated fact. On December 21, 1938 Hahn submitted a paper to _Naturwissenschaften_ showing conclusive evidence of the production of radioactive barium from neutron-irradiated uranium, i.e. evidence of fission.
In the first few weeks of January, word of the discovery traveled quickly in Europe.
January 13, 1939 - Otto Frisch observed fission directly by detecting fission fragments in an ionization chamber. With the assistance of William Arnold, he coins the term "fission".
By mid January Szilard heard about the discovery of fission from Eugene Wigner, and immediately realized that the fission fragments, due to their lower atomic weights, would have excess neutrons which must be shed. The multiplying neutron chain reaction that he had postulated had finally been discovered.
January 26, 1939 - Niels Bohr publicly announces the discovery of fission at an annual theoretical physics conference at George Washington University in Washington, DC. This announcement was the principal revelation of fission in the United States.
January 29, 1939 - Robert Oppenheimer hears about the discovery of fission; within a few minutes he realizes that excess neutrons must be emitted, and that it might be possible to build a bomb.
February 5, 1939 - Niels Bohr gained a crucial insight into the principles of fission - that U-235 and U-238 must have different fission properties, that U-238 could be fissioned by fast neutrons but not slow ones, and that U-235 accounted for observed slow fission in uranium.
At this point there were too many uncertainties about fission to see clearly whether or how self-sustaining chain reactions could arise. Key uncertainties were:
The number of neutrons emitted per fission, and
The cross sections for fission and absorption at different energies for the uranium isotopes.
For a chain reaction there would need to be both a sufficient excess of neutrons produced, and the ratio between fission to absorption averaged over the neutron energies present would need to be sufficiently large.
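The criticality condition just stated can be sketched numerically. In the toy model below, k is the number of new neutrons produced per neutron absorbed; the nu = 2 figure follows Fermi and Anderson's 1939 estimate, but the cross-section values are illustrative round numbers, not measured data.

```python
# Toy criticality check: a chain reaction is self-sustaining when
# k = nu * sigma_f / (sigma_f + sigma_c) >= 1, where nu is the number of
# neutrons released per fission, sigma_f the fission cross section, and
# sigma_c the (non-fission) capture cross section.
# ASSUMPTION: all cross-section numbers below are illustrative only.

def multiplication_factor(nu, sigma_fission, sigma_capture):
    """Neutrons produced per neutron absorbed in an infinite medium."""
    return nu * sigma_fission / (sigma_fission + sigma_capture)

# With ~2 neutrons per fission, the reaction succeeds or fails depending
# on whether fission or capture dominates absorption:
k_good = multiplication_factor(nu=2.0, sigma_fission=3.0, sigma_capture=1.0)
k_bad = multiplication_factor(nu=2.0, sigma_fission=1.0, sigma_capture=3.0)
print(k_good, k_bad)  # 1.5 (chain reaction grows) vs 0.5 (chain reaction dies out)
```

This is exactly why the isotope question mattered: the same nu can yield k above or below 1 depending on the fission-to-capture ratio at the neutron energies present.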
The different properties of U-235 and U-238 were essential to understand in determining the feasibility of an atomic bomb, or of any atomic power at all. The only uranium available for study was the isotope mixture of natural uranium, in which U-235 comprised only 0.71%.
March, 1939 - Fermi and Herbert Anderson determine that there are about two neutrons produced for every one consumed in fission.
June, 1939 - Fermi and Szilard submit a paper to _Physical Review_ describing sub-critical neutron multiplication in a lattice of uranium oxide in water, but it is clear that natural uranium and water cannot make a self-sustaining reaction. This paper is the first experimental evidence of neutron multiplication.
July 3, 1939 - Szilard writes to Fermi describing the idea of using a uranium lattice in carbon (graphite) to create a chain reaction. This is the first proposal of the graphite moderated reactor concept.
August 31, 1939 - Bohr and John A. Wheeler publish a theoretical analysis of fission. This theory implies U-235 is more fissile than U-238, and that the undiscovered element 94-239 is also very fissile. These implications are not immediately recognized.
September 1, 1939 - Germany invades Poland, beginning World War 2.
207 replies on this Thread, 184 from the same poster. Why are we giving a Soapbox to one Poster talking to himself??
We don't need to post the contents of a science encyclopedia on the WA forums do we?
Seems more like Blog material than for WA Forums.
4G Technology and the near future
4G technology will allow users to transmit and receive information at incredible speeds in the future. It will allow the Internet to be properly accessed via mobile phones, thanks to increased bandwidth.
4G, or fourth-generation technology, will increase data transmission rates to as much as 20 Mbit/sec, roughly 10 times faster than 3G's current 2 Mbit/sec and some 2,000 times faster than 2G's 9.6 Kbit/sec. 4G builds on the 3G standard, although it integrates and unifies the different air interfaces, i.e. CDMA, EDGE, etc. The introduction of 3G technology provided a huge expansion in mobile capacity and bandwidth, and 4G will do the same across the spectrum of broadband communications.
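The generation-over-generation speedups can be checked directly from the quoted figures:

```python
# Data rates as cited in this thread (the 20 Mbit/s 4G figure is a projection).
rates_bps = {
    "2G": 9.6e3,   # 9.6 Kbit/s
    "3G": 2e6,     # 2 Mbit/s
    "4G": 20e6,    # 20 Mbit/s (projected)
}

for gen in ("3G", "4G"):
    factor = rates_bps[gen] / rates_bps["2G"]
    print(f"{gen} is about {factor:.0f}x the 2G rate")
# 3G works out to roughly 208x the 2G rate, and 4G to roughly 2083x.
```

So the oft-quoted "200 times faster" figure describes 3G versus 2G; 20 Mbit/s 4G would be closer to 2,000 times the 2G rate.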
It is said that 4G technologies will give way to 3-D virtual reality and interactive video/hologram images. 4G will increase interaction between cooperating technologies, so that the smart card in your telephone will automatically pay for goods at a grocery store, or will tell your car to warm up in the morning because your phone has noted you leaving the house.
4G is expected to provide "better-than-TV" quality images and video links, although forecasts are likely to change as customer demand develops over time. It is expected that new standards for HTML, Java, HTTP, GIF, etc. will need to be developed for use in 4G.