Monday, 31 December 2012

WiMAX vs WiFi



WiMAX is a standards-based wireless technology used to provide high-speed internet access and multimedia services to the end user. WiFi, by contrast, will remain a local area network (LAN) technology for the foreseeable future. The main differences between WiMAX and WiFi are summarised below.

  • The basic differences between WiMAX and WiFi are cost, speed, and range. WiMAX coverage extends to roughly 30 miles, while WiFi coverage is limited to a small area.
  • A WiMAX network acts like an ISP without the cabling: the WiMAX signal delivers internet access to your home or business, while WiFi is used inside your local area network (LAN) to reach the internet.
  • The WiMAX architecture is designed to enable metropolitan area networking (MAN). A WiMAX base station can serve businesses and hundreds of homes, while WiFi provides only local area networking (LAN).
  • WiMAX and WiFi networks are deployed in a similar way: in both cases the ISP has its own backbone connection, such as a T3 line. In WiMAX, line-of-sight antennas connect the towers, and the towers then distribute non-line-of-sight coverage across the MAN.
  • The line-of-sight antennas of a WiMAX network operate at higher frequencies (reaching up to 66 GHz), while the non-line-of-sight links from the tower operate in a range similar to WiFi.
  • The WiMAX base station or tower beams a signal to a WiMAX receiver, just as a WiFi access point transmits a signal to the receiving device.
  • A WiMAX network provides QoS (Quality of Service), so a large number of users can access a tower at the same time, and a built-in algorithm automatically hands a user over to another WiMAX tower or cell when needed. With WiFi, users effectively have to contend with one another to stay connected to a particular access point.
  • The most significant difference is pricing: WiMAX is a high-cost network, while WiFi is low-cost, so most people adopt WiFi for its lower expenditure and avoid WiMAX because of its expensive installation.
  • WiMAX will not displace WiFi in the home, where WiFi offers better speed and keeps improving; with the passage of time, new variants of 802.11 continue to appear. WiMAX offers high speed, but the further a client is from the tower or base station, the more the speed can decrease.
  • WiMAX offers high-speed broadband access that carries data, voice, and video, while WiFi offers only short-range data transfer: it connects devices within a limited area, so mainly local file sharing and internet access are possible.
  • WiMAX is designed for long-range links in licensed or unlicensed spectrum and supports point-to-point and point-to-multipoint connections. Multiple WiMAX standards exist, such as 802.16e for mobile connectivity and 802.16d for fixed locations. WiFi, in turn, offers quality of service comparable to fixed Ethernet, where packets are prioritised according to their tags. WiFi hotspots are usually backhauled over ADSL in small businesses, cafés, and similar locations, so capacity can be a real constraint, and the upload speed between the client and the router is also much lower than with WiMAX.
  • WiMAX implements a connection-oriented MAC, while WiFi runs the CSMA/CA protocol, which is wireless and contention-based.

On the whole, WiMAX is becoming more popular by the day, but WiFi has its own useful features. WiMAX can be expected to become one of the most widely used wireless internet access technologies in the future.


Thanks & Regards,

"Remember Me When You Raise Your Hand For Dua"
Raheel Ahmed Khan
System Engineer
send2raheel@yahoo.com
send2raheel@engineer.com
sirraheel@gmail.com
send2raheel (skype id)

My Blog Spot
http://raheel-mydreamz.blogspot.com/
http://raheeldreamz.wordpress.com/

My Face book pages
http://www.facebook.com/pages/My-Dreamz-Rebiuld-our-nation
http://www.facebook.com/pages/Beauty-of-islam
http://www.facebook.com/pages/Health-is-wealth






Saturday, 29 December 2012

Wimax Technology Background - WiMAX History

 


Now the questions come to our attention: what is WiMAX technology, how does it work, and why has it taken so long to appear in the marketplace? According to the WiMAX Forum, an organization dedicated to promoting WiMAX technology and its specifications, "WiMAX is a standards-based technology enabling the delivery of last mile wireless broadband access as an alternative to cable and DSL. WiMAX will provide fixed, nomadic, portable, and, eventually, mobile wireless broadband connectivity without the need for direct line-of-sight to a base station. In a typical cell radius deployment of 3 to 10 kilometres, WiMAX Forum Certified systems can be expected to deliver capacity of up to 40 Mbps per channel, for fixed and portable access applications." (WiMAX Forum, 2007)

In the Wimax terminology, portable access means you can access Wimax networks from different locations, but not necessarily while moving. Mobile Wimax is accessible while on the move.

From this definition, we can see why Wimax Technology has been such a trendy topic in the wireless industry. The capability to provide multiple channels at 40 Mbps for cell sizes from 3 to 10 kilometres is well above what can be accomplished with standard 802.11g or 802.11a wireless point-to-point. In addition, non-line-of-sight (NLOS) features of some Wimax technology provide the possibility for better coverage in wooded or congested areas.

A WiMAX network consists of two major components: a WiMAX base station and a subscriber station (WiMAX CPE). WiMAX base stations provide connectivity to one or more subscriber stations and are deployed by service providers to provide Internet, voice, video, or Wide Area Network (WAN) link access.

These base stations are similar to Wi-Fi Access Points (APs) in that they provide centralized access to back-end connected networks. They use different standards than Wi-Fi, however, so the comparison ends there. While subscriber stations are designed specifically for 802.16 networks, in practice they simply provide a connection to the network, and you can still route internal 802.11 devices through the 802.16 subscriber station for network access.

Why HTML5 provided more tricks than treats in 2012

The stage was set with an expected one billion HTML5 phones sold by 2013. Facebook was ready to pave the way. I could repeat many other reasons why HTML5 should have taken off in 2012, but as we've seen over the last year, it just didn't. Mark Zuckerberg said it best: "The biggest mistake we made as a company was betting too much on HTML5."
Here’s an explanation for why HTML5 did not meet the high expectations set last year.

1. Cross-Platform HTML5 Development Hasn’t Taken Off
There is a massive split between desktop and mobile HTML5. Just because the technology exists across desktop and mobile doesn't mean the design considerations are the same:
1) Keyboard compared to keypad
2) Screen size of the platform
3) Mouse compared to touchscreen.

It's optimal to develop products for a specific platform. This allows developers to personalize the look, feel, and functionality of an app, which is extremely important from a user experience standpoint. The assumption held by many who were looking to HTML5 was that users would access apps across devices, from desktop to mobile. In reality, users will pick the one with the best functionality and naturally gravitate to the platform on which an app works best.
When it comes to mobile, an app has to be developed with the mobile user in mind. Nothing is more frustrating for a developer than devoting time across multiple platforms, only to discover later that your users prefer one device over another. No matter what, developing across multiple platforms takes time, energy, and thoughtfulness.

2. App Stores Deliver Discoverability, HTML5-Only Sites Are Out in the Woods
It’s easy to create a browser link with a homescreen icon for a mobile device, but much harder to change cultural practices. The challenge HTML5 publishers experience is creating an easy and positive experience to access hybrid apps. Mobile users now expect to be told to download an app and, instinctively, users search for apps in stores. Google and Apple dominate these stores and have thus far not made steps towards including HTML5 sites.
Facebook created the most publicized "universal store," listing both native apps and HTML5 sites, which some believe is a way to circumvent Apple and Google's app stores. Hoping to coax them into including HTML5 apps, Facebook assembled a network of developers under the W3C, but so far that strategy has not shown traction.

3. Hybrid Apps Can’t Depend on Mobile Browsers
I thought that at least one major console game would be released or re-released using WebGL. It may have happened, but in line with the previous point, the big mobile browser players like Chrome and Safari have shown no intention of growing their browsers to fully support HTML5 technologies. For example, WebGL, a central tool for 3D game development, has been incompatible with the aforementioned mobile browsers.
Compatibility is one issue, but there’s also speed on the mobile browser. Findings from a study we conducted earlier this year showed that HTML5 running on mobile browsers was ten to seven hundred times slower than when running on a desktop. In fact, on average mobile browsers were 889 times slower. Implicit within this data is that a large percentage of mobile users have a poor experience when accessing web apps that are graphical in nature.

4. Fragmentation, Fragmentation, Fragmentation
Is the name of the game when it comes to hybrid apps. Anyone who has built a website has experienced browser compatibility issues. Double these across platforms and you have a headache. For example, Sean Soria, an engineer at Gamzee, described some of the issues they faced building Skyscraper City in a guest post for Facebook's HTML5 blog.
In the post, Soria describes a hack to increase DOM speed: "fake 3D transforms on your CSS. That triggers hardware acceleration on most mobile devices, resulting in better performance than Canvas, for example." This is awesome, except the workaround doesn't work on Android phones. There are many issues like this, where both problems and solutions are distinct on each device.
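As a minimal TypeScript sketch of that kind of workaround, assuming a browser DOM: the element id "game-layer" and the off-Android check are made up for illustration, not taken from Gamzee's actual code.

```typescript
// Hypothetical sketch of the "fake 3D transform" hack described above.
// Assumes it runs in a browser; the element id "game-layer" is made up.
const layer = document.getElementById("game-layer");
const isAndroid = /Android/i.test(navigator.userAgent);

if (layer && !isAndroid) {
  // A no-op 3D transform nudges many mobile browsers into promoting the
  // element to a GPU-composited layer, which speeds up later 2D animation.
  layer.style.setProperty("-webkit-transform", "translate3d(0, 0, 0)");
  layer.style.transform = "translate3d(0, 0, 0)";
}
```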

5. HTML5 Isn’t Robust Enough
From what I've seen, the hype has led to many people overestimating how much developers like using JavaScript. Is JavaScript great for cross-platform development? Yes. Do developers prefer it over possible alternatives? Not quite yet. For more complicated apps, especially games, object-oriented and more strongly typed languages are still preferred by developers.
So, HTML5 didn't pan out quite how we thought it was going to. It turned into a scapegoat for Facebook and possibly one of the most overhyped advancements of the mobile generation. If HTML5 truly is the future, then we're much farther from that future than we thought. That's not to say that HTML5 won't get it right some day, just not any time soon.

Friday, 28 December 2012

World's Smallest Petrol Engine



Scientists have created the smallest petrol engine in the world (less than a centimetre long, not even half an inch), small enough to power a watch or any small gadget. The mini-motor, which runs for two years on a single squirt of lighter fuel, is set to revolutionise the technology associated with it. It generates 700 times more energy than a conventional battery and could be used to run laptops and mobile phones for months, doing away with the need for charging. Experts believe it could phase out batteries in such items within just six years. The engine, small enough to be balanced on a fingertip, has been produced by engineers at the University of Birmingham. At present, charging an ordinary battery to deliver one unit of energy involves putting 2,000 units into it. The little engine is far more efficient because the energy is produced locally.

One of the main problems faced by engineers who have tried to produce micro motors in the past has been the level of heat produced. The engines got so hot that they burned themselves out and could not be re-used. The Birmingham team finally overcame this by using heat-resistant materials such as ceramic and silicon carbide.

Top five Cracking Open teardowns of 2012

From smartphones and tablets to a $7,000 desktop, we've cracked open a lot of tech this year. And as is the tradition at the end of each year, it's time to take a look at a few of our favorites. During this special episode of Cracking Open, I'm counting down my top five teardowns of 2012.
First on our list is the Nokia Lumia 900. Released in the spring, this Windows Phone device was Nokia's attempt to recapture some of the American smartphone market.
Unfortunately, as we discovered during our teardown, the phone's hardware just wasn't up to par with the competition. I went so far as to call it mediocre. So why did I include our 900 teardown on this list? Because it's probably this year's best example of why knowing what's inside a device is an important factor in deciding whether to buy one.
Just two months after releasing the Lumia 900, we learned that it wouldn't be upgradable to Windows Phone 8. And despite huge marketing pushes from Nokia, Microsoft, and AT&T (including a 50 percent price cut three months after launch), the phone never took off. Like the phone's hardware, sales were just mediocre.
In the fourth spot is one of the most expensive items I've ever cracked open -- a $7,000+ HP Z1 Workstation.
The Z1 was unlike any other all-in-one on the market. It was packed with high-end hardware and designed to be both upgrade- and repair-friendly. Its unique stand let the machine lie flat (for easy hardware access) and the case opened more like the hood of a car than a computer. You could remove most of the internal components without using tools. And despite having six fans, it was remarkably quiet.
HP Z1 Workstation
This machine was definitely one of the most unique, and enjoyable, teardowns of 2012.
Third on our list is the highly anticipated Google Nexus 7. Assembled by Asus, the first Google-designed tablet had solid hardware, a good design, and a great price ($199). As I discovered, it was also a snap to crack open and had easily replaceable hardware.
Even with the release of the Kindle Fire HD, iPad Mini, and Nook HD, CNET tablet reviewer Eric Franklin still believes that overall, the Nexus 7 is the best small tablet you can buy. I agree.
The second spot on our list is held by another highly anticipated tablet -- Microsoft's Surface with Windows RT. Microsoft's first Windows 8 tablet came with a quad-core, Nvidia Tegra 3 processor, 2 gigs of RAM, and several nice features like a microSD card slot, full-size USB port, and kickstand.
Microsoft Surface with Windows RT
(Credit: Bill Detwiler/TechRepublic)
Unfortunately, it was also a pain to crack open and disassemble. As I wrote in my original review, "hopefully, the Surface with Windows Pro, which is aimed at businesses, will be more repair-friendly."
We've reached the end of my top-teardowns list, and sitting in the top spot is Apple's 15-inch MacBook Pro with Retina Display. Unfortunately, like the Lumia 900, this machine is on my list for all the wrong reasons.
Like all MacBook Pros, the Retina version is well-built and has solid hardware, and its display really is gorgeous. But as I discovered during my teardown, it's also nearly impossible to upgrade -- thanks to RAM that's soldered to the motherboard. And, it's a pain to work on, thanks to tamper-resistant pentalobe screws and a battery that's glued to the upper half of the case.
Apple 15" MacBook Pro with Retina display
This proved to be another example of why knowing how a device is put together and what's inside is critical when deciding whether to purchase one. The last thing you want is to find out a year after you bought your Retina MacBook Pro that you can't upgrade the memory.
 
 

Wednesday, 26 December 2012

What is F#?




F# is a typed functional programming language for the .NET Framework. It combines the succinctness, expressivity, and compositionality of typed functional programming with the runtime support, libraries, interoperability, tools and object model of .NET. F# stems from the ML family of languages and has a core language compatible with that of OCaml, though also draws from C# and Haskell. F# was designed from the ground up to be a first-class citizen on .NET, giving smooth interoperability with other .NET languages. For example, C# and F# can call each other directly. This means that F# has immediate access to all the .NET Framework APIs, including, for example, Windows Presentation Foundation and DirectX. Similarly, libraries developed in F# may be used from other .NET languages.

Since F# and OCaml share a similar core language, some OCaml libraries and applications can cross-compile either directly or with minor conditionally-compiled changes. This provides a path to cross-compile and/or port existing OCaml code to .NET, and also allows programmers to transfer skills between these languages. A major focus of the project has been to extend the reach of OCaml-like languages into arenas where they have not traditionally been used. Throughout the project the designers of F# are grateful for the support and encouragement of Xavier Leroy and others in the OCaml community.

F# As a Language:

  •  F# includes support for the foundational features of functional programming, including tuples, lists, options, function values, local function definitions, pattern matching, and sequence expressions.

  •  The powerful type inference mechanisms of F# allow code to be both succinct and fully type-checked.

  •  F# also includes support for advanced functional programming constructs such as active patterns and computation expressions. Computation expressions can be used to express data queries and client/server modalities in AJAX-style web programming. They enable programmers to write succinct and robust reactive agents through the use of asynchronous workflows. Computation expressions are related to "monads" in Haskell.

  •  F# embraces object-oriented programming and includes support for type-inferred, succinct descriptions of object types.

  •  F# allows types and values in an F# program to be accessed from other .NET languages in a predictable and friendly way.

  •  F# includes support for a form of meta-programming, inspired by LINQ. This allows data queries to be expressed and type-checked in F# code and then dynamically compiled and translated to target languages such as SQL using the LinqToSql framework.

  •  F# fully supports .NET generics, and the language was designed partly with this in mind.

  •  Through .NET, F# supports advanced language and runtime features such as Unicode strings, dynamic linking, preemptive multithreading, and SMP support.


F# for Developers:

  •  The F# Interactive environment fsi.exe supports top-level development and exploration of the dynamics of your code and environment.

  •  The command line compiler fsc.exe supports separate compilation, debug information and optimization.

  •  F# comes with F# for Visual Studio, an extension to Visual Studio 2005 that supports features such as an integrated build/debug environment, graphical debugging, interactive syntax highlighting, parsing and typechecking, IntelliSense, CodeSense, MethodTips and a project system.

  •  F# can be used with tools from the .NET Framework, Microsoft's Visual Studio and many other .NET development tools.

  •  F# comes with an ML compatibility library that approximates the OCaml 3.06 libraries. This means you don't have to use .NET libraries if it is not appropriate. It is possible to write large and sophisticated applications that can be cross-compiled as OCaml code or F# code, and we take this mode of use seriously.


Tuesday, 25 December 2012

Features of WiMAX Technology


Wimax Features

WiMAX is a major advance in wireless technology, providing broadband access to mobile users over a range of up to 30 miles. WiMAX is based on the IEEE 802.16 standard; it is a telecommunication protocol offering mobile internet access across cities and countries to a wide range of devices. The salient features of WiMAX are described below.

Wimax support multipath

WiMAX uses an OFDM-based physical layer (orthogonal frequency-division multiplexing), which gives it strong resistance to multipath propagation. Thanks to this architecture, users can operate in non-line-of-sight (NLOS) conditions, and WiMAX is now well known for how it handles multipath in a wireless network.

Wimax broadband access

WiMAX offers very high-speed broadband access to the mobile internet. With a 20 MHz channel the peak data rate can be as high as 74 Mbps; a typical deployment uses a 10 MHz channel with a TDD scheme and a 3:1 downlink-to-uplink ratio. Higher data rates can also be achieved with multiple antennas, which are used for beamforming, space-time coding, and so on.
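As a rough illustration of where a headline figure like 74 Mbps comes from, the sketch below simply multiplies channel bandwidth by an assumed spectral efficiency; the 3.7 bits/s/Hz value is an assumption chosen to match the number quoted above, not a figure taken from the 802.16 specification.

```typescript
// Back-of-the-envelope only: peak rate ≈ bandwidth × spectral efficiency.
const channelBandwidthHz = 20e6; // 20 MHz channel, as mentioned above
const assumedBitsPerHz = 3.7;    // assumed efficiency (e.g. 64-QAM with coding overhead)
const peakBps = channelBandwidthHz * assumedBitsPerHz;
console.log(`peak ≈ ${(peakBps / 1e6).toFixed(0)} Mbps`); // ≈ 74 Mbps

// With a 3:1 downlink-to-uplink TDD split, the downlink gets 3/4 of the airtime.
const downlinkShare = 3 / (3 + 1);
console.log(`downlink share of airtime: ${downlinkShare}`); // 0.75
```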

Wimax offer high speed data rate

Another feature of WiMAX is its high-speed, scalable physical layer. WiMAX scales easily with the available channel bandwidth: for channel bandwidths from 1.25 MHz up to 20 MHz, the system can use 128-, 512-, 1024-, or 2048-point FFTs, which allows devices to roam dynamically across networks with different bandwidths.
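A small sketch of what that scaling means in practice; the bandwidth-to-FFT pairings below are the commonly cited scalable OFDMA values, stated here as an assumption rather than quoted from the standard text.

```typescript
// Illustrative only: in scalable OFDMA the FFT size grows with the channel
// bandwidth so the subcarrier spacing stays roughly constant.
const fftSizeByBandwidthMHz: Record<string, number> = {
  "1.25": 128,
  "5": 512,
  "10": 1024,
  "20": 2048,
};

for (const [mhz, fftSize] of Object.entries(fftSizeByBandwidthMHz)) {
  const spacingKHz = (Number(mhz) * 1000) / fftSize;
  console.log(`${mhz} MHz -> ${fftSize}-point FFT, ~${spacingKHz.toFixed(1)} kHz spacing`);
}
```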

Wimax offer modulation and error correction

The use of WiMAX is increasing rapidly partly because it supports many modulation and error-correction schemes and allows them to be changed according to channel conditions. Adaptive Modulation and Coding (AMC) is a valuable method for maximising throughput over a varying channel.

Wimax support reliability of data

WiMAX supports automatic retransmission of data at the data link layer. This improves reliability through ARQ (Automatic Repeat reQuest), which requires each transmitted packet to be acknowledged by the receiver; any unacknowledged packets are assumed to be lost and are retransmitted.

Wimax support TDD and FDD

Another significant feature of WiMAX is that it supports both Time Division Duplexing (TDD) and Frequency Division Duplexing (FDD). Both allow a low-cost system implementation.

Wimax TDM scheme

In WiMAX, all uplink and downlink traffic is controlled by a scheduler at the base station. The total capacity is shared between users according to their demand, using a TDM (Time Division Multiplexing) scheme.

Wimax MAC layer

The WiMAX MAC layer is connection-oriented. Through the MAC layer, a variety of applications can be supported, including multimedia and voice, along with different service classes such as best-effort data and real-time traffic flows. WiMAX was designed to serve a large number of users with a variety of connections per terminal.

Wimax strong encryption

WiMAX also provides the user with strong encryption. Using AES (Advanced Encryption Standard), users get strong privacy and key management, and the EAP protocol offers a flexible authentication architecture that supports usernames and passwords, certificates, and smart cards.

Wimax mobility

One of the most important features of WiMAX is its support for mobility and mobile applications such as VoIP. Power-saving mechanisms extend the battery life of handheld devices, and the air interface supports mobility-oriented functions including channel estimation, subchannelization, and power control.

Getting access to a WiMAX base station is no longer a difficult task, because the wide coverage of WiMAX provides access to the base station from home, and installing the hardware is easy. As WiMAX grows, more features are likely to appear.



Sunday, 23 December 2012

HOW WIMAX WORKS



If you have already read about WiMAX technology, the next question that arises is how WiMAX works. WiMAX is a telecommunications and mobile technology used to transmit wireless data using a number of transmission methods. WiMAX stands for Worldwide Interoperability for Microwave Access; it offers internet access over point-to-point or point-to-multipoint links and is based on IEEE 802.16.

WiMAX provides broadband access as an alternative to conventional cable or DSL lines. Its working method differs a little from a WiFi network: a WiFi computer connects via a LAN card, router, or hotspot, while a WiMAX network consists of two parts, a WiMAX tower (also known as the WiMAX base station) and a WiMAX receiver, the Customer Premises Equipment (WiMAX CPE).

A WiMAX network works much like a cell phone network. When a user sends data from a subscriber device to a base station, that transmission over the wireless channel is called the uplink; when the base station transmits to the same or another user, that is the downlink. The WiMAX base station has higher transmit power, better antennas, and enhanced algorithms.



Wimax 2 Technology




   
What is WiMAX 2 Technology

WiMAX 2 is the next step of mobile WiMAX, based on the IEEE 802.16m standard. Well-known industry leaders have announced proposals to speed up WiMAX 2 solutions built upon IEEE 802.16m. WiMAX 2 is in the final stage of approval; trials may begin in 2011, with commercial deployment possible in 2012.

The objectives behind WiMAX 2 are technology cooperation and mutual performance benchmarking, joint testing of 4G applications over WiMAX 2 solutions, and early network-level interoperability testing. Several companies, including ZTE, Intel, Samsung, and Motorola, have announced WiMAX 2 work with speed as its primary benefit. Two types of WiMAX already exist: fixed WiMAX, which is faster than WiFi, and mobile WiMAX, which is used as an alternative to 3G mobile operators and can deliver about 144 Mbps download and 35 Mbps upload. WiMAX 2, by contrast, is intended to meet the advanced 4G targets of 1 Gbps access speed for fixed nodes and 100 Mbps for fast-moving mobile nodes.

Specifications of WiMAX 2 Technology

The specifications of WiMAX 2 are built on the Institute of Electrical and Electronics Engineers (IEEE) 802.16m standard and include high over-the-air efficiency. WiMAX companies have said that the WiMAX 2 standard will be backward compatible with the preceding standards, while WiMAX networks are already proving extremely successful at carrying very high-bandwidth mobile services. The future of WiMAX is underpinned by WiMAX 2, a powerful new-generation technology with long-term growth ahead of it, and it will be much faster than its forerunner.

Features of WiMAX 2

=> WiMAX 2 can deliver a blazing speed of 120 Mbps down and 60 Mbps up without restriction, using 4×2 MIMO antennas in a 20 MHz channel to extend coverage.

=> With WiMAX 2 the speed will double, and users will be able to do much more than before.

=> WiMAX 2 offers very high speed across numerous devices, to the delight of clients.

=> A WiMAX 2 network can offer VoIP calls with around 300 Mbit/s of throughput, more bandwidth, and lower latency.

=> For operators facing explosive growth in traffic, WiMAX 2 meets the demand for mobile data and a variety of services.

=> Where the first generation of WiMAX barely carved out a place in the market, WiMAX 2 offers a low-cost, all-IP mobile network solution.

=> Efficient transmission of data, voice, and video is a clear strength of a WiMAX 2 network.

=> WiMAX 2 promises to deliver a matchless combination of linearity, noise performance, and component integration over a remarkably broad band of operation.

Intel and Samsung, top leaders in driving the IEEE 802.16 standard, are pleased to see strong ecosystem support promising to accelerate WiMAX 2 capabilities. The WiMAX Forum is working to speed up the completion of interoperable system profiles for WiMAX 2 equipment and devices. According to the WiMAX Forum there are about 559 WiMAX deployments in 147 countries, reaching 620 million people globally, and almost 130 devices and 60 base stations in the 2.3 GHz, 2.5 GHz, and 3.5 GHz bands have been certified by the Forum.

Samsung, Motorola, and Intel have formed a coalition around WiMAX 2, whose goal is to improve the economics of mobile broadband. The group will release detailed information about milestones and delivery schedules in the next three to six months, with the objective of supporting the WiMAX Forum. The WiMAX Forum is convinced that WiMAX 2 will rapidly expand the mobile WiMAX market and make the mobile internet available to communities in all corners of the world. In short, WiMAX 2 meets the International Telecommunication Union requirements for advanced 4G technology, delivering higher system capacity, peak rates in excess of 300 Mbps, lower latency, and increased Voice over Internet Protocol (VoIP) capacity.


Saturday, 22 December 2012

10 things you need to know about iPad and iPhone development



 If you’re getting ready to jump into iOS development, these practical insights will help you get started.

With the iPad’s domination of the tablet space and the iPhone continuing to enjoy strong sales, interest in development for these two platforms keeps growing. I have started getting my head into the iOS development game. These are 10 things I have learned about developing for iPads and iPhones along the way.

1: You need a Mac

It may sound like a conspiracy theory to get folks to buy Macs, but without a Mac you won’t be able to get your application onto a device for testing. And you need to be testing on a device.

2: You really should get an iPad and an iPhone or iPod Touch

Yes, there is a simulator. But the truth is, simulators only go so far in replicating the experience a user will have. In my experience with mobile application development, even “simple” applications can be a joy to use in the simulator and a hassle on an actual device. And since you’ll likely want your application to work well both for iPhone/iPod Touch and iPad, you will want to get an iPad and either an iPhone or an iPod Touch (the two are identical as far as development is concerned).

3: Objective-C is a bit of a throwback

My first reaction at looking at Objective-C was, “Wow, that looks like the stuff I cut my teeth on!” And I was right. While Objective-C supports modern programming elements like object-oriented code, it is a fairly low-level language, too, and it clearly has not strayed too far from C. For example, you need to prototype functions in a .h file. It reminds me in many ways of the Delphi system in that regard.

4: XCode is radically different from Eclipse and Visual Studio

Coming from the Visual Studio system, with a couple of minor detours into Eclipse, I found XCode to be a bit jarring. The focus is really less on everything that happens in the toolbars, sidebars, and menus, and more on what happens in the middle of the screen, which is writing code as text. This isn’t to say that XCode isn’t visual or that it lacks tools. But the overall system simply has a different philosophy from the kitchen sink approach that Eclipse and Visual Studio take.

5: XCode is ready to work with Subversion or Git

Out of the box, XCode comes equipped to work with Subversion or Git. You are still free to use any other source control system you want (through command-line tools, if they don’t have a GUI system or XCode integration). But if you already use Subversion or Git, you will be happy.

6: You should sign up for your developer account early

It can take up to two weeks for your developer account to be approved. The sooner you sign up, the sooner you will be able to get your app deployed to your test devices or uploaded to the App Store for approval.

7: There are different types of developer accounts

Developer accounts come in three major flavors: individual, company/organization, and enterprise. The main difference between individual and company/organization is that the latter allows you to create users within the account who can access it. Individual accounts are limited to a single user. Enterprise accounts are an entirely different beast: They allow for private deployments, which is exactly what an IT department writing apps for internal use needs. There is also an academic account for students, which allows some access to the developer program.

8: You can write code without a developer account

The good news is, if you are just learning, and are willing to forego deployment to a test device or putting your app in the App Store, you can use XCode and the iOS simulator without a developer account. The developer account has lots of benefits, including early access to betas and such, but for learning purposes, no account is needed.

9: iPads are not just big iPhones

When designing UIs, it’s tempting to think that iPads are just large iPhones. While this is more or less true at a code level (apps that run on iPhone will run on the iPad, though iPad-specific apps will not run on iPhone), it is a big mistake for designing the UI. An iPad’s bigger screen allows you to pack a lot more information on the screen without overwhelming the user, and the larger screen size will affect what kinds of UI widgets can be comfortably used.

10: There are alternatives to Objective-C and XCode

If, for whatever reason, you do not want to work with Objective-C and XCode, that is just fine. A wide variety of other options are available for iPad and iPhone development work. You can write C# in MonoTouch or use HTML and JavaScript in Titanium (or a number of other systems), and those are just two of the more well-known options. Once you stop working in Objective-C, you do not need to be using XCode, either.


Wednesday, 19 December 2012

Types of Wimax Technology

Types of WiMAX Technology (802.16)

The WiMAX family of standards (802.16) concentrates on two usage models: a fixed WiMAX usage model and a mobile WiMAX usage model. The basic element that differentiates these systems is the ground speed at which they are designed to operate. Based on the mobility they support without disruption of service, wireless access systems can be divided into three classes: stationary, pedestrian, and vehicular.
  
A mobile wimax network access system is one that can address the vehicular class, whereas the fixed wimax serves the stationary and pedestrian classes. This raises a question about the nomadic wireless access system, which is referred to as a system that works as a fixed wimax network access system but can change its location.

Fixed Wimax

Broadband service and consumer usage of fixed WiMAX access are expected to reflect those of fixed wire-line service, with many of the standards-based requirements being confined to the air interface. Because communication takes place via wireless links from the WiMAX Customer Premises Equipment (WiMAX CPE) to a remote non-line-of-sight (NLOS) WiMAX base station, the requirements for link security are greater than those needed for a wired service. The security mechanisms within the IEEE 802.16 standards are sufficient for fixed WiMAX access service.

Another challenge for the fixed WiMAX access air interface is the need to set up high-performance radio links capable of data rates comparable to wired broadband service, using equipment that can be self-installed indoors by users, as is the case for Digital Subscriber Line (DSL) and cable modems. The IEEE 802.16 standards provide advanced physical (PHY) layer techniques to achieve link margins capable of supporting high throughput in NLOS environments.

Mobile Wimax

The 802.16a extension, refined in January 2003, uses a lower frequency of 2 to 11 GHz, enabling NLOS connections. The latest 802.16e task group is capitalizing on the new capabilities this provides by working on developing a specification to enable mobile Wimax clients. These clients will be able to hand off between Wimax base stations, enabling users to roam between service areas.

Wimax backhaul

WiMAX backhaul is the connection system from the Access Point (AP) back to the provider, and from the provider to the core network. A WiMAX backhaul can use any technology and medium, provided it connects the system to the backbone. In most WiMAX deployments, it is also possible to connect several WiMAX base stations with one another using high-speed backhaul microwave links. This also allows a WiMAX subscriber to roam from one base station's coverage area to another, similar to the roaming enabled by cellular phone companies.

There are two cases of portability: full mobility and limited mobility. The simplest case of portable service involves a user transporting a WiMAX modem to a different location. Provided the visited location is served by wireless broadband service, the user re-authenticates, manually re-establishes new IP connections, and is afforded broadband service at the visited location.

In the fully mobile scenario, user expectations for connectivity are comparable to facilities available in third generation (3G) voice/data systems. Users may move around while engaged in a broadband data access or multimedia streaming session. Mobile wireless systems need to be robust against rapid channel variation to support vehicular speeds.

Mobility has significant implications at the IP layer, owing to the need to maintain routability of the host IP address and preserve in-flight packets during IP handoff. This may require authentication and handoffs for uplink and downlink IP packets and Medium Access Control (MAC) frames. Supporting low-latency, low-packet-loss handover of data streams as users transition from one base station to another is clearly a challenging task. For mobile data services, users will not readily lower their service expectations because of environmental limitations that are technical in nature and not directly visible to them. For these reasons, the network and air interface must be designed to anticipate these user expectations and deliver accordingly.



The 5 best IT certifications of 2013

1: VMware Certified Professional (VCP)

VCP is not that other contagious 'V' disease. It's all about virtualization. At least a few times a week I hear about virtualization from clients, co-workers, and IT forums, and even while browsing LinkedIn groups and job descriptions it pops up as the in-demand skill for 2013 and beyond.
Look at virtualization from a macro view. The IT OEM industry as a whole shows enterprise technology becoming more powerful and faster every quarter, if not every day. Think about it: as companies grow and keep up with the demands technology places on their business operations, they look to maximize their IT infrastructure and investments. Server farms, monitoring solutions, business applications, and other systems are driving the demand to increase virtualization across multiple servers. This growing need should result in strong demand for IT professionals who specialize in virtualization. And who is the leader in this space? VMware. The VMware Certified Professional will be a sought-after commodity to address this virtualization demand.
Does it Pay?
VCP (VMware Certified Professional) seems to separate holders from the pack financially, appearing as a common factor in around 80% of jobs seeking this skill set. I don't have an average salary figure, but if a skill is hot and in demand, organizations will pay.

2. Certified Information Security Manager (CISM)

What is not a security issue these days? We all know IT infrastructure and security continue to get more complex every month, and the last 3 years alone have generated more interest in IT security than the whole previous decade.
Companies recognize the need for highly skilled IT managers who know their way around today's technology and security issues and understand how they relate to business objectives. The Certified Information Security Manager certification maps directly onto this need and the advancing complexity of IT infrastructure. As a result, the CISM has exploded into a must-have certification at the IT manager level, and a solid credential to have under your belt to move up the ranks to Senior IT Manager or CTO-level executive positions.
Does it Pay?
Certified Information Security Managers (CISM) command salaries in the $110,000+ range, and many IT Managers hold this certification. So if you're a network and system administrator seeking that promotion to IT management, you might want to think about this certification.
And again, if a skill is hot and in demand, companies with high security concerns (the financial industry, public companies, and so on) will value the investment in someone with the CISM and this security expertise.

3. Enterprise Administrator Windows Server (MCITP)

I caught this one and added it to my list after a recent global IT hiring survey. After mulling it over, I realized how undervalued, yet badly needed, this certification is in today's IT market.
Think about this: what percentage of global businesses are running on a Windows server? Sure, Apple sales are hot and eating into the business application world, but unless you're in a design shop or your shop is dominated by Apple hardware and business apps, the majority of businesses today are still Windows-server driven, or tied on the back end to a Windows server.
The Enterprise Administrator Windows Server certification (MCITP) is a solid skill set that countless small businesses and enterprise organizations want, need and require! Experts with MCITP shouldn’t have much time finding opportunities needing windows server administration.
I hate to admit this one as I'm a pro-Apple user, but organizations need Windows server experts, they are not always easy to find, and organizations will compensate very nicely those they do find. It is still a Windows world, and the Windows server is the heart of business operations. And if you're MCITP certified, you're the insurance policy and the surgeon for the business and its operations, and both should pay handsomely.
Eric Eckel, an author at TechRepublic who wrote the highly recommended top 10 best certifications of 2012, said it best: IT professionals with the MCITP (Microsoft Certified IT Professional) Enterprise Administrator on Windows Server 2008 accreditation demonstrate significant, measurable proficiency with Active Directory, configuring network and application infrastructures, enterprise environments, and the Windows 7 client OS.
Eric’s post at TechRepublic went on to give honorable mentions for the top spot including the MCITP: Virtualization Administrator on Windows Server 2008 R2 and MCITP: Enterprise Messaging Administrator on Exchange 2010.
The rationale: Microsoft Exchange owns the SMB/E (small and medium-sized business or enterprise) space, and virtualization initiatives are only getting started and will dominate technology sectors for the next decade at least (agreed!).
Does it Pay?
IT Managers and system administrators who can knowledgeably navigate Microsoft’s virtualization and email platforms will only grow in importance.
The MCITP (Microsoft Certified IT Professional) may not have skyrocketed in popularity, but it is one of those niche, essential certifications that not enough IT professionals carry. Small businesses and enterprise organizations alike will find MCITP expertise a lifeline for their business. So what's the pay? The general range is $80,000 to $105,000.

4. Microsoft Certified Technology Specialist (MCTS)

How did the MCTS make my top 5 IT certification list for 2013? Among the few criteria that weighed it into my list, the deciding factor was the reality of the Windows business world. Microsoft's family of server and desktop operating systems is still the most widely adopted software in the business world.
And while there are countless levels of  Microsoft certifications ranging from Microsoft Technology Associate level certifications all the way up to Microsoft Certified Architects, the MCTS (Microsoft Certified Technology Specialist) certification applies to the everyday operations and server issues most IT professionals deal with.
IT professionals with MCTS certification show a level of IT expertise many IT professionals can only desire to attain. Those who have already dealt with some form of Microsoft certification can attest to the countless exams MCITP requires. But once attained, organizations in the Windows world do understand the value and reputation the MCTS accreditation offers.
Does it Pay?
The Microsoft Certified Technology Specialist certification is one that complements and builds on some of the accreditations listed above and gives you that salary boost. Depending on experience and role (IT consultant, support technician, system administrator, IT Manager, etc.), the MCTS shows a range of $58,000 to $78,000. But this is one of those IT certifications that gives the IT professional 'the edge': how you leverage the weight of its value in your salary negotiation or raise will determine its worth, and yours.
And remember, it IS a Windows world.

5. Cisco Certified Network Associate (CCNA)

You don't need to look far these days to see just how wired people are. Maybe it's a technology addiction or a shift in business mentality, but it's a sign of the times. It is obvious how wired companies have become: iPhones, BlackBerrys, iPads, Androids, laptops, and so on. There is a trend away from traditional PCs, which is why a Cisco Certified Network Associate certification is an appealing accreditation for 2013.
The shift in how businesses approach communication and connectivity is the strongest indicator of how they are changing the way they do business. This also means how IT operations, systems and networks will structure their IT enterprise infrastructure towards wireless and security (hint: CCNA security certification).
IT professionals with the Cisco Certified Network Associate (CCNA) certificate can show businesses their expertise in setting up, troubleshooting, monitoring, upgrading, and maintaining networking hardware based on Cisco equipment.
Does it Pay?
The Cisco Certified Network Associate (CCNA) certification shows a salary range far too wide to post here. However, a strong indicator of its value is that many senior-level system and network administration careers and IT Manager positions all highlight this skill set. Again, another 2013 IT certification that may give you the edge.
Whether you are an IT manager, IT support specialist, network administrator, engineer, or system support technician, and whether your daily operations involve server monitoring, business application support, or managing email server issues (and the list goes on), you cannot go wrong positioning your career path in 2013 with these IT certifications.



Wimax Technology





What is Wimax Technology

WiMAX stands for Worldwide Interoperability for Microwave Access. It is a telecommunications technology that offers wireless data transmission via a number of methods, such as portable or fully mobile internet access over point-to-multipoint links. WiMAX offers around 72 megabits per second without any need for cable infrastructure. It is based on the IEEE 802.16 standard and is also commonly called Broadband Wireless Access. The name was created by the WiMAX Forum, which was formed in mid-June 2001 to encourage compliance and interoperability of the IEEE 802.16 standard. WiMAX is based on standards that make it possible to deliver last-mile broadband access as a substitute for conventional cable and DSL lines.

WiMAX (802.16) is often confused with the names mobile WiMAX, 802.16d, fixed WiMAX, and 802.16e. In fact, 802.16-2004 (802.16d) is the version of the standard usually referred to as Fixed WiMAX, because it lacks mobility support. As WiMAX (802.16) matured, amendments were made to 802.16d, and the amended standard is referred to as 802.16e. 802.16e introduced mobility among other features and is also known as Mobile WiMAX.

Fewer than one in five people in the developed world, and an even smaller percentage of people worldwide, have broadband access today. Existing technologies such as Digital Subscriber Line (DSL), cable, and fixed wireless are hampered by expensive installations, problems with loop lengths, upstream upgrade issues, line-of-sight restrictions, and poor scalability.

WiMAX (802.16) is the next stage towards a broadband, wireless world, extending broadband wireless access to new locations and over longer distances, as well as considerably reducing the cost of bringing broadband to new areas. WiMAX (802.16) offers greater range and bandwidth than other available or forthcoming broadband wireless technologies such as the Wireless Fidelity (Wi-Fi) and Ultra-wideband (UWB) families of standards. It provides a wireless alternative to wired backhaul and last-mile deployments that use Data Over Cable Service Interface Specification (DOCSIS) cable modems, Digital Subscriber Line technologies (DSL), T-carrier and E-carrier (Tx/Ex) systems, and Optical Carrier Level (OC-x) technologies. (Jiffy Networks, 2006)

The general initiative of metropolitan area wireless networking, as envisioned with 802.16, begins with what is called fixed wireless. A backbone of base stations is connected to a public network, and each base station serves hundreds of fixed subscriber stations, which can be both public hot spots and firewalled enterprise networks. Later in the development cycle, with 802.16e, WiMAX (802.16) is expected to support mobile wireless technology, specifically wireless transmissions directly to mobile end users. This will be similar in function to the General Packet Radio Service (GPRS) and the 1xRTT (one times Radio Transmission Technology) service offered by mobile phone companies.

Organizations as well as individuals are increasingly adopting broadband, while those already using broadband are becoming dependent on it and are demanding better services with added benefits. To support this exceptional new demand, WiMAX (802.16) has emerged as a feasible solution because of its inherent features, which hold great promise for the future of wireless communications. (Teri Robinson, 2005)

There has been a lot of excitement about Wimax (802.16) and the impact that this standards based wireless network technology will have on the broadband access market. All this hype has generated great expectations, and the industry has responded with exceptional aggression and commitment toward taking broadband to the next level with Wimax (802.16).

How WiMAX Works

The backhaul of WiMAX (802.16) is based on a typical connection to the public network using optical fibre, a microwave link, cable, or any other high-speed connectivity. In a few cases, such as mesh networks, point-to-multipoint (PMP) connectivity is also used as backhaul. Ideally, WiMAX (802.16) should use point-to-point antennas as backhaul to join subscriber sites to each other and to base stations across long distances.

A wimax base station serves subscriber stations using Non-Line-of-Sight (NLOS) or LOS Point-to-Multi-Point connectivity; and this connection is referred to as the last mile communication.  Ideally, Wimax (802.16) should use NLOS Point-to-Multi-Point antennas to connect residential or business subscribers to the Wimax Base Station (BS). A Subscriber Station (Wimax CPE) typically serves a building using wired or wireless LAN. (Steven J. Vaughan-Nichols, June 2004).



Monday, 17 December 2012

10 new HTML5 tags you need to know about



HTML5 offers new tags and attributes that provide more power, efficiency, and flexibility for your Web development. Here are 10 tags you’ll want to check out.
HTML5 brings a host of new elements and attributes to allow developers to make their documents more easily understood by other systems (especially search engines!), display data more uniquely, and take on some of the load that has required complex JavaScript or browser plug-ins like Flash and Silverlight to handle. Here are 10 new items in HTML5 that will make it easier for you to write your Web sites.

1: <video> and <audio>

One of the biggest uses for Flash, Silverlight, and similar technologies is to get a multimedia item to play. With HTML5 supporting the new video and audio controls, those technologies are now relegated to being used for fallback status. The browser can now natively display the controls, and the content can be manipulated through JavaScript. Don’t let the codec confusion scare you away. You can specify multiple sources for content, so you can make sure that your multimedia will play regardless of what codecs the user’s browser supports.
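
As a rough sketch of the multiple-source idea described above (the file names movie.mp4 and movie.webm are placeholders, not real files), the markup might look something like this:

<video controls width="640" height="360">
  <!-- The browser plays the first source whose codec it supports -->
  <source src="movie.mp4" type="video/mp4">
  <source src="movie.webm" type="video/webm">
  <!-- Fallback content for browsers without <video> support -->
  <p>Your browser does not support HTML5 video.</p>
</video>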

2: <input> type attributes

The venerable <input> element now has a number of new values for the type attribute, and browsers do some pretty slick things depending on its value. For example, set type to “datetime” and browsers can show calendar/clock controls to pick the right time, a trick that used to require JavaScript. There is a wide variety of type attributes, and learning them (and the additional attributes that go with some of them) will eliminate the need for a lot of JavaScript work.
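
A quick sketch for illustration only; the field names here are invented, and browser support for each type value varies:

<form>
  <input type="email"    name="user_email" placeholder="you@example.com">
  <input type="number"   name="quantity"   min="1" max="10">
  <input type="datetime" name="meeting">   <!-- some browsers expect datetime-local instead -->
  <input type="range"    name="volume"     min="0" max="100">
</form>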

3: <canvas>

The <canvas> tag gives HTML a bitmapped surface to work with, much like what you would use with GDI+ or the .NET Image object. While <canvas> isn’t perfect (layers need to be replicated by using multiple canvas objects stacked on top of each other, for example), it is a great way to build charts and graphs, which have been a traditional weak spot in HTML, as well as custom graphics. And that is just a start!
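
A minimal sketch of drawing on a canvas from JavaScript; the element id and the bar values are invented for the example:

<canvas id="chart" width="300" height="150"></canvas>
<script>
  // Get the 2D drawing context and draw three simple bars
  var ctx = document.getElementById("chart").getContext("2d");
  var values = [40, 90, 60];   // sample data, purely illustrative
  for (var i = 0; i < values.length; i++) {
    ctx.fillRect(20 + i * 60, 150 - values[i], 40, values[i]);
  }
</script>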

4: <header> and <footer>

The <header> and <footer> tags are two of the new semantic tags available. These two tags do not get you anything above and beyond <div> for the actual display. But they will reap long-term rewards for your search engine efforts, since the search engines will be able to tell the difference between “content” and things that are important to the user but that aren’t the actual content.
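
A bare-bones sketch of how these tags might wrap a page; the headings and text are placeholders:

<header>
  <h1>My Site</h1>
  <nav><!-- navigation links --></nav>
</header>
<!-- main page content goes here -->
<footer>
  <p>Copyright notice, contact links, and other small print</p>
</footer>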

5: <article> and <section>

The <article> and <section> tags are two more semantic tags that will boost your search engine visibility. Articles can be composed of multiple sections, and a section can have multiple articles. Confusing? Not really. An article represents a full block of content, and a section is a piece of a bigger whole. For example, if you are looking at a blog, the front page might have a section for the listing of all the posts, and each post would be an article with a section for the actual post and another for comments.
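
Following the blog example above, the nesting might be sketched like this (the HTML comments stand in for real content):

<section>                               <!-- listing of all the posts -->
  <article>                             <!-- one complete post -->
    <section><!-- the post itself --></section>
    <section><!-- the comments --></section>
  </article>
  <article><!-- another post --></article>
</section>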

6: <output>

The new <output> tag is unique, in that it expects its content to be generated dynamically with JavaScript. It has a value attribute, which can be manipulated through the DOM with JavaScript to change what is displayed on the screen. This is much more convenient than the current ways of doing things.
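
A small sketch in which the <output> value is recalculated as the user types; the control names are invented:

<form oninput="result.value = Number(a.value) * Number(b.value)">
  <input name="a" type="number" value="2"> x
  <input name="b" type="number" value="3"> =
  <output name="result">6</output>
</form>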

7: <details>

It seems like every Web site needs to have an expanding/collapsing block of text. While this is easy enough to do with JavaScript or server-side code, the <details> tag makes it even easier. It does exactly what we’ve all been doing for years now: makes a simple block that expands and collapses the content when the header is clicked. The <details> tag does not have widespread support yet, but it will soon.
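
A minimal sketch; the companion <summary> element supplies the clickable header text:

<details>
  <summary>Shipping information</summary>
  <p>This content stays hidden until the summary line is clicked.</p>
</details>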

8: <figure> and <figcaption>

<figure> is a container for content (typically images, but it can be anything), and <figcaption> (which gets put inside the <figure> tag) provides a caption or subtitle for the contents of the <figure> tag. For example, you could have four images representing charts of sales growth within a <figure> tag, and a <figcaption> with text like “Year-to-year sales growth, 1989 - 1993.” The images would be shown next to each other with the text running below all four.
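
Using the sales-growth example above as a sketch (the image file names are placeholders):

<figure>
  <img src="sales-1989.png" alt="1989 sales">
  <img src="sales-1990.png" alt="1990 sales">
  <img src="sales-1991.png" alt="1991 sales">
  <img src="sales-1992.png" alt="1992 sales">
  <figcaption>Year-to-year sales growth, 1989 - 1993</figcaption>
</figure>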

9: <progress> and <meter>

<progress> and <meter> are similar. You use <progress> for a task or a “measure how complete something is” scenario. It also has an indeterminate mode for something that has an unknown duration (like searching a database). The <meter> tag is for gauges and measurements of value (thermometers, quantity used, etc.). While they may look alike on the screen in many cases, they do have different semantic meanings.
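
A short sketch showing both elements; the values are invented:

<!-- A task that is 70% complete -->
<progress value="70" max="100">70%</progress>

<!-- The same element with no value shows an indeterminate "working" state -->
<progress></progress>

<!-- A gauge, e.g. how full a disk is -->
<meter value="0.6" min="0" max="1" high="0.9">60% used</meter>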

10: <datalist>

The <datalist> tag acts like a combo box, where the system provides a pre-made list of suggestions, but users are free to type in their own input as well. There are tons of possible uses for this, such as a search box pre-populated with items based on the user’s history. This is another one of those things that currently requires a bunch of JavaScript (or JavaScript libraries) to handle but that can be done natively with HTML5.
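
A minimal sketch of the combo-box idea; the list entries are just sample values, and the list attribute on the input points at the datalist's id:

<input list="browsers" name="browser" placeholder="Type or pick a browser">
<datalist id="browsers">
  <option value="Chrome">
  <option value="Firefox">
  <option value="Internet Explorer">
  <option value="Opera">
  <option value="Safari">
</datalist>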


Thanks & Regards,

"Remember Me When You Raise Your Hand For Dua"
Raheel Ahmed Khan
System Engineer
send2raheel@yahoo.com
send2raheel@engineer.com
sirraheel@gmail.com
send2raheel (skype id)

My Blog Spot
http://raheel-mydreamz.blogspot.com/
http://raheeldreamz.wordpress.com/

My Face book pages
http://www.facebook.com/pages/My-Dreamz-Rebiuld-our-nation
http://www.facebook.com/pages/Beauty-of-islam
http://www.facebook.com/pages/Health-is-wealth

10 reasons Windows 8 will be painful for developers







Ever since the release of the Windows 8 Developer Preview, people have had a lot to say about the experience of playing with the new OS. But few folks are talking about the changes it represents for developers. Windows 8 is the biggest update to the Windows development model since the move from Windows 3.X to Windows 95. While there are lots of good things, there are also a lot of pain points. If you are looking to develop Windows 8 native applications with the new UI and WinRT API, be careful of these 10 things.

1: Market reboot

If you want your applications to be fully compatible with Windows 8 (including running on ARM CPUs), you’ll need to do a full rewrite in Windows 8/WinRT. This may be great for developers looking to break into markets with established players. But if you are the established player, you are suddenly back at square one.

2: The asynchronous model

Windows 8 development is highly dependent upon asynchronous operations for anything that is long running. While that may be a cute trick in some scenarios, it is downright frustrating in others (like trying to download a file). It isn’t just the work needed to handle the async call; it’s things like error handling and reporting problems back to the user. It requires a whole new approach to the UI from what developers (especially WinForms developers) are used to.

3: Lack of direct disk access

Windows 8 cuts off direct access to the system in quite a few ways, but the one that will hurt typical developers the most is the lack of disk access. Windows 8 follows an extreme isolation model for applications, and if your application requires access to data outside its own confined little world (including networked services you can access), you can forget about porting it to Windows 8.

4: Touch UI paradigm

Unless you have been writing a lot of mobile apps, shifting to the new UI style, which is designed for touch interaction, is going to be pretty tough. It took me a long time to get a feel for what works well and what doesn't. To make things more difficult, what looks and works well on a screen with a mouse and keyboard can be a poor experience with touch, and things that work well with touch are often a struggle to use with a mouse and keyboard. It's a tricky balancing act, and as the uproar over the new UI in Windows 8 shows, even Microsoft is struggling to get it right despite having had a few years of experience with it.

5: Playing by Microsoft’s app store rules

If you want to be in the Microsoft app store, you will need to learn to play by its rules. While the rules are fairly reasonable, it will be a jarring experience if it is anything like the WP7 App Hub. For starters, Microsoft rigorously inspects the application and looks for all sorts of things, like unhandled exceptions and circular UI paths. Although this ensures a high-quality app, it can be a surprise to developers. In addition, you need to work with an approval process. The details of the Microsoft application store are still under wraps, but recent experience with WP7 suggests that it won't be fun.

6: Heavy emphasis on cloud

While there is no mandate to use the cloud, Web services, and other off-premise techniques and technologies, it is most definitely encouraged. Things like automatic syncing of settings and data between devices (regardless of how it is done) will become the rule, not the exception, and users will be expecting it. Windows 8 makes this easy (you can have your locally saved information synced automatically with Live), but you will want to be judicious about how you do it for sensitive data. Encryption and other privacy and security techniques will become more important than ever.

7: Shift to “contracts” and “interfaces” for interop

One unique aspect of the Windows 8 paradigm is the idea that applications can provide services to the OS (such as acting as a source of contacts or pictures), as opposed to just dumping data into a common directory. This allows all sorts of sweet application concepts. But even though this is easy at the technical level, it's difficult to figure out how to leverage it at the conceptual level.

8: Market uncertainty

Now we get into the more high-level pains. Microsoft is clearly pushing Windows 8 for tablets and maybe even phones. Right now, we’re seeing Android struggle in the tablet space, and at the same time, the new Windows 8 UI has been heavily panned by people who have tried the preview versions.  Will the market adopt Windows 8 or reject it?  Will the tablet market for Windows 8 take off? These are all questions that won’t be answered until it is far too late to be a first mover in the market. If you are going to bet on Windows 8, you simply can’t properly assess the risks right now.

9: Lack of tablet hardware

For developers, not having tablets to try Windows 8 on has been a major problem. Yes, we’ve seen some tablets on Web sites, but not in person. Some (like the Lenovo Twist and some of the Samsung slates) resemble current devices enough that you can use what amounts to their predecessors to test. Others (especially the ARM devices) are just too different from existing products to allow a comparison, so you have to wait until October 26 to get an idea of what they are like on real hardware.
There has been no good way to get an idea of what the user experience will be like for your applications on those tablets, not just in terms of the UI but also in terms of performance. Can the tablet CPUs run your app well? Is it too "chatty" for a device on a cellular connection? Are you using more storage than makes sense for the typical tablet we'll see? Without a few tablet models easily available, we don't know the answers here.

10: The trail of dead tech

This is the one that really breaks my heart. Microsoft has a history of pushing a technology as “the next big thing” and then leaving it dying on the vine a few years later. We don’t know if Microsoft will back off its Windows 8 strategy before launch, right after launch (Kin), or a few years down the road (Zune, Silverlight). If the new Windows 8 paradigm is not a success, Microsoft may very well change course in a way that renders all your hard work on Windows 8 native applications a waste of time.


Thanks & Regards,

"Remember Me When You Raise Your Hand For Dua"
Raheel Ahmed Khan
System Engineer
send2raheel@yahoo.com
send2raheel@engineer.com
sirraheel@gmail.com
send2raheel (skype id)

My Blog Spot
http://raheel-mydreamz.blogspot.com/
http://raheeldreamz.wordpress.com/

My Face book pages
http://www.facebook.com/pages/My-Dreamz-Rebiuld-our-nation
http://www.facebook.com/pages/Beauty-of-islam
http://www.facebook.com/pages/Health-is-wealth
