The features of USB 3.0 - such as rapid data transfer rates and backward compatibility with USB 2.0/1.1 devices that requires no adjustment from users - have led to uptake rates rising rapidly from 20-30% last year to 80-90% in a short space of time. However, there are limitations to the interface breaking into the industrial control sector, such as the lack of a performance boost for existing USB 2.0 devices and the three-meter effective data transmission range. To this end, VIA Labs (VIA), the global leader in USB 3.0 technology, has focused on launching USB2Expressway and other technologies in order to reduce equipment costs and simplify complex installations, so as to improve system equipment performance.

In the past, virtually every peripheral device for PCs used a different interface with its own distinct specifications. These interfaces were not only incompatible with each other, but also required users to invest a great deal of time and energy in installing and configuring drivers before the devices could be used, making them difficult to manage and inconvenient to use; businesses could squander enormous amounts of manpower and time as a result. As its name suggests, the Universal Serial Bus (USB) interface was intended to solve the problems of divergent specifications and the various issues associated with these interfaces by creating a consistent industry standard for electronic device data transmission.

As David Hsu, product marketing associate VP at VIA Labs, noted during the Digitimes Embedded Technology Forum in February, USB quickly made its presence felt in a variety of ICT products, including keyboards, mice, speakers, storage devices, video devices, all-in-one printer/scanner/copiers and industrial control equipment, largely because of its ease of use and its plug & play and hot-plugging functionality.
Besides serving as a data transfer interface, many devices even use USB as a power socket. Hsu believes that in terms of market uptake, the USB specification could be described as the most successful application interface invention in human history.

USB 3.0 is the major trend going forward

IT technology has continued to evolve year after year since the launch of USB, and as a result, specifications from earlier years are no longer up to the task. Improving specifications is therefore essential to cope with the operations of today's systems and devices. This is another reason why USB 3.0 has emerged, Hsu explained. As data transfer rates for USB 3.0 are more than ten times faster than USB 2.0, and the interface also allows two-way transfer, it delivers a marked increase in performance. It is also backward compatible with USB 2.0/1.1 devices and requires no change in users' habits. These advantages have enabled USB 3.0 uptake to soar in a very short space of time. USB 3.0 uptake rates were only 20-30% last year, but are likely to hit 80-90% this year, noted Hsu.

Major ICT hardware manufacturers such as Intel, AMD and Nvidia have already started offering products with USB 3.0 interfaces, and Microsoft's next-generation Windows 8 will also follow suit by supporting built-in USB 3.0 drivers. For the industrial control sector, in contrast to USB 2.0, USB 3.0 provides ultra-high bandwidth of up to 5Gbps, hot-plugging and plug & play functionality, and support for hub structures that integrate multiple port connections, so that a single computer can simultaneously control many devices. For these reasons, manufacturers have already begun to adopt it.

Challenges of bringing USB 3.0 to the industrial control sector

The high performance of USB 3.0 is very helpful to factories and users in terms of increasing productivity.
For example, when USB 2.0 launched in 2000, a typical SD memory card held 16MB; a card of the same physical size today holds 16GB, a thousand-fold increase in capacity. Data of this size would take 8 minutes 53 seconds to transfer at USB 2.0's 30MB per second effective rate; at USB 3.0's 300MB per second, the transfer takes just 53 seconds. For factory control devices that make extensive use of USB interfaces, production capacity and operational efficiency can thus be increased by a factor of ten. This is very beneficial in terms of making a factory more competitive, noted Hsu.

However, before USB 3.0 can truly make its mark in the industrial control sector, there are a number of difficulties that must be overcome, so it is unlikely to replace USB 2.0 overnight. Hsu explained that the main reason is that USB 3.0 solutions and customized designs are still quite complex, and the technology is not yet mature enough to support a wide range of applications. The industrial control sector has also yet to fully master design capabilities for USB 3.0, so the industry has produced only a handful of USB 3.0 device controller ICs designed specifically for industrial control or embedded platforms. The upshot is that USB 3.0 and USB 2.0 devices will continue to coexist in factories for some time yet.

Even if factories were to switch to a setup using USB 3.0 computers with USB 3.0 hubs, it would still be impossible to improve the performance of existing USB 2.0 devices; in some cases performance would actually become slower. Hsu explained that when multiple USB 2.0 devices operate at the same time, they share the single USB 2.0 bandwidth (480Mbps), making it difficult to get the best performance out of the devices (with four USB 2.0 devices, each device is allocated only 120Mbps of bandwidth), thereby affecting the overall production capacity of the factory.
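Hsu's figures can be checked with a little arithmetic. This sketch assumes 1GB = 1,000MB, matching the round numbers quoted in the talk:

```python
# Transfer-time and bandwidth-sharing arithmetic behind Hsu's figures.
# Assumes 1GB = 1,000MB, matching the round numbers quoted in the talk.

def transfer_time_seconds(size_mb, rate_mb_per_s):
    """Seconds needed to move size_mb at a sustained rate."""
    return size_mb / rate_mb_per_s

card_mb = 16_000  # a 16GB SD card

usb2 = transfer_time_seconds(card_mb, 30)   # ~533s, i.e. 8 min 53 s
usb3 = transfer_time_seconds(card_mb, 300)  # ~53s

print(f"USB 2.0: {usb2 // 60:.0f} min {usb2 % 60:.0f} s")
print(f"USB 3.0: {usb3:.0f} s")

# Shared-bus effect: multiple USB 2.0 devices split one 480Mbps bus.
def per_device_mbps(total_mbps, n_devices):
    return total_mbps / n_devices

print(per_device_mbps(480, 4))  # 120.0 Mbps per device
```

The same division explains the slowdown Hsu warns about: the more USB 2.0 devices share one upstream bus, the smaller each device's slice.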
Moreover, standard USB 3.0 data cables are limited to a length of three meters, so the possible distance between devices is very short. Factories attach a great deal of importance to scale and efficiency, and always require that a certain distance is maintained between the devices being operated and the computer itself, said Hsu. These limitations constitute obstacles to USB 3.0 penetrating the industrial control sector.

VIA Labs' solution: USB2Expressway and USB 3.0 AOC

VIA has created its new USB2Expressway technology precisely to solve the problems discussed above. USB2Expressway applies VIA's USB 3.0 Enhanced HUB concept and its unique, independently developed U3TT chip technology to the "one to many" control model used across much of the industrial control sector, providing ample bandwidth of 480Mbps to every USB 2.0 device connected downstream of the hub. USB 2.0 devices used to be able only to share the bandwidth provided by the USB 2.0 bus, so even when no devices were using the USB 3.0 hub's bandwidth, those connections simply sat idle. As Hsu explained, VIA's unique design means that USB 2.0 devices on its USB 3.0 Enhanced HUB can now use the up-to-5Gbps USB 3.0 bandwidth. This makes it possible for every USB 2.0 device on the hub to access the full 480Mbps, markedly boosting device performance.

The three-meter limitation on USB 3.0 signal transfer distances can also be extended by the V0510 high-speed optical receiver developed by VIA. When applied to a USB 3.0 active optical cable (AOC), the technology increases transfer distances massively, to as much as 100-300 meters. The main reason VIA used optical fiber rather than active signal boosters, Hsu said, is that optical fiber uses light as its medium, and so is not easily affected by the electromagnetic (EM) interference that would otherwise weaken the signal.
Optical fiber also gives out no EM radiation of its own, making USB 3.0 more suitable for use in medical imaging, broadcast TV, digital signage, retail and other applications where interference is particularly unwelcome.

Hsu explained that VIA's exclusive USB2Expressway dedicated-bandwidth technology not only supports multiple USB 2.0 devices simultaneously, but also provides each port with full, dedicated bandwidth, something that is extremely helpful for improving the overall operating performance of a system. Hsu noted that test data shows that when only a single USB 2.0 device is connected, both standard and USB2Expressway connections give data transfer rates of around 35MB/s; when the number of devices connected is gradually increased to four, USB 2.0 devices connected via USB2Expressway still achieve the same level of performance on average, while the transfer rate for ordinary connections falls to around 10MB/s. Consequently, businesses do not need to spend large sums replacing existing USB 2.0 devices in their factories, while complex installation and setup tasks can also be simplified: users only need to replace their hubs.

VIA Labs is headquartered in Taipei and has worked to further the industry ecosystem for USB 3.0 products since its inception in 2008. VIA is not only the first company in the world to have a full product line covering USB 3.0 hosts, hubs and devices, but has also maintained a longstanding and productive relationship with USB-IF and Microsoft, the two major forces in USB standard formulation. The company has very strong logic and firmware development teams, and all the core technologies, components, digital circuits and software drivers for its host products were independently developed. Hsu explained that in the space of just a couple of years, VIA has developed four USB 3.0 product lines and successfully shipped them all.
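A simple model reproduces the shape of the test figures Hsu quoted. The ~35MB/s single-device ceiling is taken from his numbers; treating the shared bus as a clean 60MB/s split is an assumption, and real hubs lose further throughput to per-transaction overhead, which is why the measured four-device figure (~10MB/s) sits below this sketch's prediction:

```python
# Sketch of shared vs dedicated upstream bandwidth on a hub, assuming one
# USB 2.0 device sustains at most ~35MB/s on its own (Hsu's figure).
# Protocol overhead is ignored, so real shared-bus rates land lower.

DEVICE_CEILING_MB_S = 35.0   # effective rate of a single USB 2.0 device
USB2_BUS_MB_S = 480 / 8      # one shared 480Mbps bus = 60MB/s raw

def ordinary_hub_rate(n_devices):
    """Ordinary hub: all devices split one USB 2.0 bus."""
    share = USB2_BUS_MB_S / n_devices
    return min(DEVICE_CEILING_MB_S, share)

def usb2expressway_rate(n_devices):
    """USB2Expressway: each port gets its own dedicated 480Mbps upstream."""
    return DEVICE_CEILING_MB_S

for n in (1, 2, 4):
    print(n, round(ordinary_hub_rate(n), 1), usb2expressway_rate(n))
```

With one device both paths hit the device ceiling; by four devices the shared bus, not the device, is the bottleneck, which is the effect USB2Expressway removes.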
Not only does VIA provide USB 3.0 mass-production solutions and complete product lines for well-known firms including Buffalo, Microsoft, Acer, Asus, MSI, Gigabyte and D-Link, but the USB-IF organization has even designated VIA hub products for use in the creation of its Interop Trees and Backward Compatibility Trees.

David Hsu, product marketing associate VP at VIA Labs
Photo: Digitimes

VIA Labs explaining the advantages of its USB2Expressway technology to participants at the Digitimes Embedded Technology Forum in February
Photo: Digitimes
The new-generation UEFI firmware architecture offers an integrated environment to accelerate firmware development. Its functions not only cover those of the legacy BIOS, but also improve boot-up performance, system security, cloud connectivity and intelligent technology support. The modularized firmware architecture and up-to-date code base can greatly reduce firmware development time, meeting system developers' demand for multiple products in small volumes and with diverse specifications.

During the Digitimes Embedded Technology Forum in February, US-based Phoenix Technologies senior director of engineering, Terry Chen, stressed that Phoenix's competitive edge lies in the fact that it offers excellent firmware architectures and comprehensive development environments. Phoenix's UEFI firmware product, Phoenix SecureCore Technology (SCT), supports multiple embedded computing platforms, including embedded modules, motherboards, graphics cards and systems for the vertically integrated industrial, networking, retail, medical, energy, military and gaming markets. For firmware development tools, Phoenix offers an integrated development environment, development tool kits and professional debugging devices that together form a complete solution for clients' firmware development.

A look at Phoenix's SCT firmware architecture chart reveals that the Foundation (Green H) is formed by the EFI-developed EDK1 with patches and EDK2. Above it is the Phoenix-architected Kernel. Next is the Executive layer, where Phoenix adopts a clear modularized design: different combinations of processors, chipsets and operating systems, as well as different system makers' firmware specifications, can be accommodated by modularized changes and partial upgrades. The System layer (UEFI protocols) at the top covers Common PEIM (Common Pre-EFI Initialization Modules) and DXE (Driver eXecution Environment).
For the UEFI user interface and applications, it offers Quick Boot Technology, Leading Boot Manager Technology and Parametric Build for Maximum Configurability.

Phoenix's technological strengths come from more than 30 years of experience in the BIOS field. Phoenix SCT meets the latest UEFI standards, and its modularized architecture and clear code tree increase the reusability of the firmware kernel while accelerating the development of firmware projects, allowing products to hit market shelves sooner.

New features in Phoenix SCT 2.2

Chen noted the major features of Phoenix SCT 2.2: support for the Windows 8 operating system; improvements to boot performance; better system integration; an optimized user interface; wider support for peripherals; enhanced server management; compliance with the NIST SP800-147 BIOS Protection Guidelines; and support for common industry specifications such as UEFI 2.3.1, TCG 1.2/2.0, ACPI 4.0/5.0 and SMBIOS 2.7.

He further pointed out its new features. First, a hardware IRQ (interrupt request) replaces the SMI (system management interrupt) of the legacy USB emulation function; because there is no need to define the SMI, porting is easier and a wider range of USB devices already on the market can be supported. Second, it supports a wide range of USB 3.0 xHCI controller chips. Third, it features Crisis Recovery for USB 2.0/USB 3.0 peripherals. Fourth, it improves compatibility with peripheral drivers through USB driver emulation and recognition of the SCSI instruction set.
Phoenix SCT adds new boot devices/run-levels: first, UEFI boot driving SCSI devices directly from the legacy Option ROM (OPROM), with the maximum capacity of each boot partition reaching 2.2TB; second, embedded system boot for SD cards, the SD I/O interface port and CF ATA memory cards; third, UEFI LAN boot from the network interface card; fourth, legacy SA OPROM boot from the motherboard; and fifth, OPROM boot from add-in interface cards. For other support, it redesigns the entire firmware code, clearing out confusing and clumsy platform code to offer a simplified architecture for platform porting, and it automatically indicates CMOS information and generates related code during the build process.

Phoenix SCT 2.2 also supports WinHost64 to execute 64-bit applications under the 64-bit Windows environment, and the redesigned kernel simplifies the code while remaining capable of supporting prototypes of new architecture versions. For debugging, it features SBCS (single-byte character set) or DBCS (double-byte character set) Port 80 enhanced POST-code output during driver loading.

Finally, the enhanced Milestone Task easily achieves modularized customization without any previous versions being overwritten or falling into disuse. In terms of embedded functions, it offers ATX power supply simulation to emulate legacy AT power supply hardware switch functions. It also supports multiple Serial I/O chips, two-phase password input, headless operation without video, and recoverable errors.

Planning for next-generation Phoenix SCT 3

Chen pointed out that the ultimate aim of the next-generation Phoenix SCT 3 is to have EDK II continue the performance and stability of Phoenix SCT 2 products, with compatibility across x86 and ARM platforms to satisfy the needs of the cross-platform generation.
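The 2.2TB per-partition ceiling mentioned above is the classic limit of legacy MBR partition tables, which store sector addresses as 32-bit LBAs over 512-byte sectors; booting larger partitions requires GPT, which UEFI supports natively:

```python
# Where the 2.2TB boot-partition ceiling comes from: legacy MBR partition
# tables store sector addresses as 32-bit LBAs, and sectors are 512 bytes.
max_sectors = 2**32     # largest addressable sector count with a 32-bit LBA
sector_bytes = 512

limit_bytes = max_sectors * sector_bytes
print(limit_bytes)            # 2199023255552 bytes
print(limit_bytes / 10**12)   # ~2.199 decimal terabytes, i.e. "2.2TB"
```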
At the same time, through the Hybrid Build System it will support both SCT 2.x and EDK2 architectures, as well as hybrid development environments, to assist clients' smooth transition from SCT 2.x to SCT 3.0. He said that SCT 3 will be designed to meet the needs of the ARM, server, storage, embedded and tablet markets, optimizing boot-up time, ROM space and UEFI firmware functions.

An integrated toolkit supporting burning, editing, R&D, debugging and manufacturing

Chen introduced Phoenix's Tool Development Kits (TDK), which allow for fast cross-platform BIOS development. TDK gathers Flash Tools, BIOS Edit, RD Utility, QA Tool and MFG Tool into one single user interface to enable easier learning and customization. With it, unique firmware tools can be developed according to specific manufacturing and debugging needs.

TDK adopts standard C language and an open-source API, and can also accommodate customized code independently developed by OEM firms. The API offers binary-code operations, fast memory burns, reading and writing of files and disks, Control Panel input/output, BIOS communication, hardware access, milestones, debugging, BIOS library services, SVN access, security programming, and a C language library. It supports the DOS, Windows, EFI Shell and Linux operating systems. TDK provides a single-interface development environment based on open source, which facilitates native execution, quick implementation, easy customization and fast debugging to reduce R&D costs.

Fifth-generation firmware kernel architecture offers accelerated project-based development

Chen concluded by noting that Phoenix CoreArchitect 5 (PCA5) works with all versions of Microsoft Visual Studio. One single toolkit can take control of the entire development process, making use of an intuitive icon-based interface design to provide an easy and comprehensive firmware development and debugging environment.
With a mouse-over pull-down menu, one can examine each and every module and development guideline for the entire project, and can also customize the setup in line with one's specific needs to accelerate firmware development. PCA5 also supports EDK 2.3.1, BLDK (Intel Boot Loader Development Kit) and Phoenix SCT BIOS, offering a comprehensive firmware development and debugging environment for the entire process, from Edit and Configure through Build and Flash to Debug.

Phoenix Technologies senior director of engineering, Terry Chen
Photo: Digitimes

Phoenix demonstrates its SecureCore Technology at the DTF 2012 embedded technology forum
Photo: Digitimes
The development of processors has entered the generation of the "heterogeneous multi-core processor", in which different types of processors (such as GPUs, DSPs and general-purpose processors) are integrated on the same chip. This is not only delivering strong compute performance, but also bringing about several innovative changes in the functionality and control of embedded system products, according to AMD. However, if R&D personnel continue during the transition to approach development with the working habits established for single-core processor designs, the gains from adopting multi-core architectures will ultimately be quite limited.

The situation is similar to that of the internal combustion engine, which was rapidly applied in many different types of vehicles such as cars, trucks and motorcycles following its invention in the 19th century, and which in the 1990s, driven by environmental protection and energy conservation concerns, evolved into the gasoline-electric hybrid powertrain. Starting from the advent of the first integrated-circuit-based embedded computers in the 1960s, a diverse range of applications has emerged which can now be seen everywhere from enterprise PCs, vehicles, industrial machines and medical equipment down to daily-use devices such as handsets, tablet PCs and TV set-top boxes (STBs). By 2011, integrated circuits had evolved into heterogeneous multi-core processors to satisfy demand for applications such as real-time video recording, 3D graphics and human-computer voice/physical interactivity, all of which involve complex or high-volume compute workloads.

The term "heterogeneous multi-core processor" refers to the integration of different types of processors (such as GPUs, DSPs and general-purpose processors) into the same chip, with a structure designed to allow the internal processors to share the same main memory.
AMD corporate vice president of worldwide business management and channel marketing David J Kenyon pointed out that, limited by physical conditions such as semiconductor manufacturing processes, power consumption and architectural complexity, traditional single-core processor architectures and the pursuit of higher clock frequencies to enhance performance have already reached their limit; therefore, makers have started to pursue multi-core processor architectures, integrating multiple processing units into a chip to resolve performance bottlenecks.

Development of homogeneous multi-core processors facing bottleneck

When makers designed multi-core processors in the past, most designs were based on homogeneous multi-core architectures, the advantage of which is that the operating system can assign workloads to any of the processing units at any time. However, although architectures in this category can deliver faster compute performance than a traditional single-core processor, performance gains eventually become limited by power consumption and cannot be expanded further. In addition, the performance of homogeneous multi-core architectures is also deeply affected by whether software and individual workloads can be written to support multi-threading, which puts limits on future scalability.
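The multi-threading limit described above is usually quantified with Amdahl's law (not named in the talk, but the standard model for this argument): if a fraction p of a workload can run in parallel, the speedup on n identical cores is capped at 1 / ((1 - p) + p/n), no matter how large n grows.

```python
# Amdahl's law: the ceiling on homogeneous multi-core speedup when only a
# fraction p of the workload can be parallelized across n identical cores.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelizable, speedup saturates near 10x
# as core counts climb - the plateau described above.
for n in (2, 4, 16, 1024):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

The serial fraction, not the core count, dominates the limit, which is why adding more identical cores yields diminishing returns.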
Based on the estimated development curve of homogeneous multi-core architectures, which is expected to gradually flatten in its end phase, current designs have already reached a plateau where performance gains come in ever smaller increments.

In view of this trend, AMD has unveiled its latest-generation AMD Fusion accelerated processing unit (APU), which integrates a standard x86 multi-core central processing unit (CPU) with a graphics processing unit (GPU) in a single chip, offering system-level programmability via the DirectCompute application programming interface and the Open Computing Language (OpenCL). The design speeds up data transmission between the cores, while assisting equipment makers looking to transition from a homogeneous multi-core environment into the heterogeneous multi-core processor industry.

"Not only are we helping to expand industry standards, we are also targeting applications including high-level frameworks, mid-range video/image/mathematics/science/physics computations and different types of compile and debug tools to make improvements," said Kenyon, who also pointed out that, "In the future, if equipment needs to process work with high-level logic and complex calculations, this can be handled by the CPU, while the GPU can be specifically responsible for resolving single high-density workloads that are related to graphics."

Development of heterogeneous multi-core processors to spur new revolution in the industry

In addition to strong compute performance, AMD's Fusion APU architecture also delivers several innovative changes in the functions and control of embedded products.
"Due to the appearance of the heterogeneous multi-core processor, many software and hardware environments, as well as the open standards supported by such systems, have received across-the-board improvements, allowing ideas that used to be unfeasible for lack of compute support to begin to be realized."

Such ideas include the playback of high-definition movies, real-time voice recognition, human body interaction and gesture control, conversion of 2D images into more realistic 3D, and high-volume real-time information transfers, said Kenyon, adding that "No matter if it's in handsets, entertainment, audio-visual, medical, consumer shopping, industrial equipment, military weapons or any other industry, the revolutionary effects of heterogeneous multi-core processors can already be seen."

Kenyon believes that, "As demand for richer content continues to increase, heterogeneous multi-core processors are currently the only viable path for equipment players." By offering a powerful and efficient integrated processor as well as support for data parallelism, AMD's heterogeneous multi-core processor architecture can be quickly scaled up to several hundred powerful compute engines to provide system acceleration, in contrast to homogeneous multi-core designs, which are limited by their number of cores.

"Because heterogeneous multi-core architectures are still in the earliest stages of development, there is still considerable room for performance improvements targeting certain applications," said Kenyon. During the process of improvement, the only limitation will be the programming models used.
But if R&D personnel continue to follow the working habits and models of single-core processor architectures, any improvements will be quite limited.

To help partners nurture a new development culture and produce faster code more quickly, AMD has established the AMD Developer Central website (http://developer.amd.com/) targeting R&D personnel working on OpenCL and AMD Fusion APU-based products. In addition to providing tools such as software development kits (SDKs), AMD gDEBugger, AMD APP KernelAnalyzer, AMD CodeAnalyst Performance Analyzer, the x86 Open64 Compiler Suite and AMD APP Profiler, the website also offers resources such as topic forums, seminars, technical documents, components/libraries and example source code to allow interested parties to interact with one another. Kenyon said, "We hope that this ecosystem of software, tools, and partner solutions will help the industry to simplify the creation, development and deployment of heterogeneous multi-core processor-based products."

AMD's promise to partners for heterogeneous multi-core processors

AMD's accelerated parallel processing (APP) technology has already been broadly adopted in fields such as general networking, video-conferencing, medical imaging, smart signals, video surveillance, notebooks, national defense technology and green high-performance computing (green HPC). Kenyon pointed out, "Currently, typical application examples such as ray tracing, scientific simulation, seismic-wave imaging and real-time planetary models have been the first to leverage the AMD Fusion APU SDK and OpenCL tools to develop software that takes advantage of the processing ability of GPUs to accelerate compute performance."

AMD Fusion APUs not only effectively combine low-power x86 cores via parallel processing; the built-in AMD Radeon GPU also features extraordinary parallel processing performance.
Furthermore, components can be combined and utilized in a variety of ways to meet the demands of different applications.

As an example, US-based Emerson leveraged the power and flexibility of AMD's programmable APUs as part of an upgrade of the company's network of sensors and detectors from wired to wireless, aiming to increase management effectiveness. The company's goal was to cut the recalculation time for 100 nodes to below one minute. "The company used AMD's G-series APU as the basis for the new platform and adopted TMT's parallel processing software to resolve the problem." When the project was completed, the company was able to finish the calculation for 250 nodes in just 14 seconds.

Currently, AMD has many partners in the software, motherboard and system industries, such as TMT, Sage, Viosoft, ArcSoft, Brown Deer, Caps, PolyCore and Zircom. Kenyon said, "Heterogeneous multi-core processors are the trend of the future and are bound to trigger a new wave of innovations and applications. To that end, AMD promises to continue to follow open standards such as OpenCL in order to provide partners with services to simplify development, while assisting them in effectively leveraging advanced parallel architectures and unleashing the full potential of integrated CPU/GPU designs, ultimately providing embedded device platform developers with superior performance and better implementations."

David J Kenyon, AMD corporate vice president of worldwide business management and channel marketing
Photo: Digitimes

AMD introducing the latest embedded systems with heterogeneous multi-core processor solutions
Photo: Digitimes
The cloud computing wave stirred up by strong demand for tablet PCs and smartphones will also spread to the Internet of Things (IoT) in the future, creating new business opportunities that everyone will crave a share of. But many small- to medium-size businesses (SMBs), particularly those in the embedded industry, will find it unaffordable to build their own private cloud architecture because of limitations in operational scale, funding and human resources. Leading embedded platform developer Advantech, eyeing such business opportunities, has launched industrial cloud services to address this market. With the services, embedded devices that are difficult to manage because of their scattered locations - such as digital signage and monitors - can be controlled, switched on and off, installed and upgraded through cloud computing. This improves efficiency and lowers maintenance costs.

Industrial cloud enhances intelligent use of embedded devices

Speaking at the recent "Embedded Forum" (part of the Digitimes Technology Forum - DTF series), Advantech Embedded Core Group project manager CL Chiang noted that in 2010 Advantech had already realized that cloud computing would be the direction of future industrial development, but that no solutions specifically designed for embedded/industrial systems were readily available on the market. Advantech therefore decided to move in this direction by developing industrial cloud services to help its business partners construct their own industrial clouds. He identified the industrial cloud elements as follows: identification; manageability; interconnectivity; security protection; intelligent system; vertical applications; and biz model & services.

Chiang also outlined Advantech's vision for embedded cloud computing: with built-in industrial cloud support, existing embedded systems can become intelligent systems with Internet connectivity and intelligent computing capabilities.
The integration of the existing IoT with the industrial cloud can turn connected devices worldwide into intelligent systems; the combination of the two will give birth to a "Smart Earth."

Focusing on developing intelligent platforms and the industrial cloud

Chiang pointed out that Advantech's focus since inception has been on the development of intelligent platforms, such as computers on modules, embedded single board computers (SBCs), multiple-I/O extension SBCs, industrial motherboards and slot SBCs. Another direction has been to provide assistance for developing industrial cloud services, for which it has so far launched Industrial Cloud Center, Embedded Apps, the Cloud Pro Guidebook and CloudBuilder.

The Cloud Pro Guidebook mainly offers detailed step-by-step instructions and in-depth analyses that, together with industry-specific cloud applications, embedded industry players can follow to quickly and correctly set up their own private clouds. There are three steps: first, Intelligent Devices; second, Network Performance and Security Protection; and third, Design for Industry, On-Demand Software and Embedded App.

CloudBuilder was the main focus of the speech. Advantech has designed One-Click Installation to Industrial Clouds, which can be applied to embedded systems for a wide array of purposes ranging from medical, gaming, POS, vehicle, machine automation, factory automation, signage, military and marine to transportation.

Embedded cloud applications to construct the last mile of the industrial cloud

Chiang then introduced the members of Advantech's Embedded App series, each of which targets different industrial environments.
Remote Monitoring is for surveillance purposes; Remote Desktop offers intelligent remote control; Remote On/Off allows devices to be switched on and off remotely; System Protection protects and licenses cloud software; and System Recovery restores cloud systems.

He went on to introduce a medical cloud application: a bedside management system for hospitals. This system has an interactive LCD display installed beside the patient's bed. When making their rounds, doctors can click buttons on the display's screen to review freshly completed diagnoses and explain them to the patient. After the doctors have left, the patient can pick up the remote control to play games, watch TV and movies, surf the Internet or make telephone calls to friends and relatives via the system. It provides professional assistance to medical personnel, as well as a better and more comfortable hospital stay for patients.

Another example cited was a monitoring cloud: a police roadside monitoring system. The roadside monitoring system developed by Advantech employs a region-by-region management interface that allows personnel at the control center to operate everything from the computer screen: they can switch between different regions, conduct the necessary control and make sure each monitoring system is working normally. When there is tampering or a malfunction, technicians can be dispatched to conduct on-site repairs. Some regional police stations have already realized the benefits of industrial cloud services; they have adopted Advantech's cloud monitoring system and also engaged the company for future maintenance work.

Demonstration of industrial cloud applications

Chiang noted that the Advantech-developed SUSIAccess retains the flexible features of the company's previous SUSI API software suites and integrates central monitoring functions. The SUSIAccess menu displays large icons for Remote Monitoring, Remote On/Off, Remote Desktop, System Recovery and System Protection.
System control personnel can also implement various API (application programming interface) functions to facilitate full remote control and access to the applications. In the future, mobile devices such as smartphones may also be employed to check for system anomalies.

He also demonstrated the features of CloudBuilder, such as One-Click Installation, and Notification Center, which can detect the latest versions and download upgrades once Industrial Cloud Center is installed. Advantech can customize cloud services to meet clients' specific needs, such as Notification Center, Hardware Monitoring, Booting Manager, Smart Battery and SQflash Tools (flash burn tools), through back-end computers, tablet PCs and smartphones.

All product lines support intelligent management

Chiang mentioned that since March 2011 Advantech has equipped all its product lines with intelligent management chips. Business partners only need the Advantech-provided Remote Monitoring Embedded App to enable close monitoring of their devices. When Advantech releases new versions of drivers and firmware, the system will automatically notify the client. At the end of 2011, Advantech announced the CloudBuilder solution for constructing an industrial cloud in three steps. The free CloudBuilder software is easy to install, maintain and use; users only need to follow the instructions displayed on the screen and can complete the installation in three minutes.

The next step in Advantech's cloud service development is to extend the services to more sectors, such as intelligent healthcare, POS kiosks, on-board computers, security surveillance and the IoT. It will release relevant APIs to allow more system developers to participate in the development of cloud service projects.
It will also let Taiwan's embedded industry seize business opportunities from this wave of the cloud computing revolution, and achieve sustainable development.

Advantech embedded software manager Chiang Chin-ling
Photo: Digitimes

Participants show keen interest in Advantech's industrial cloud, which explores the potential and huge business opportunities of the Internet of Things.
Photo: Digitimes
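The Remote Monitoring and Remote On/Off functions described above boil down to a small device-management surface: register devices, flip their power state, read their status, and scan for anomalies. The sketch below models that surface in Python; all class and method names are illustrative assumptions, not Advantech's actual SUSIAccess API.

```python
# Hypothetical sketch of a remote device-management client in the spirit
# of the Remote Monitoring / Remote On/Off Embedded Apps. An in-memory
# dictionary stands in for the cloud side; names are invented.

class RemoteDeviceClient:
    """Minimal in-memory stand-in for a cloud device-management endpoint."""

    def __init__(self):
        self._devices = {}  # device_id -> {"power": bool, "temp_c": float}

    def register(self, device_id, temp_c=40.0):
        self._devices[device_id] = {"power": True, "temp_c": temp_c}

    def power(self, device_id, on):
        # Remote On/Off: switch a device on or off from the control center.
        self._devices[device_id]["power"] = on

    def monitor(self, device_id):
        # Remote Monitoring: report the current status of one device.
        return dict(self._devices[device_id])

    def anomalies(self, temp_limit_c=70.0):
        # Flag powered-on devices whose temperature exceeds a threshold --
        # the kind of check a smartphone client might poll for.
        return [d for d, s in self._devices.items()
                if s["power"] and s["temp_c"] > temp_limit_c]


client = RemoteDeviceClient()
client.register("kiosk-01", temp_c=45.0)
client.register("signage-02", temp_c=82.0)
client.power("kiosk-01", False)
print(client.monitor("kiosk-01"))   # {'power': False, 'temp_c': 45.0}
print(client.anomalies())           # ['signage-02']
```

A real deployment would add networking, authentication and persistence behind the same small interface; the point here is only how little surface area the described functions require.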
The integration of the three major IT application technologies - embedded systems, the Internet of Things (IoT), and cloud computing architecture - will trigger an "intelligent system" revolution in the conventional embedded device market, creating completely new life, work and decision-making experiences for business managers, employees and general consumers. Daniel Li, OEM embedded sales manager at Microsoft, thinks that OEMs must seize the huge business opportunities generated by this revolution, and actively draw on the resources of mature development and management platforms already available on the market to develop the intelligent systems that businesses need. Only by doing so can they stand a chance of succeeding in future competition.

Li noted that the availability of low-cost, high-performance chips promises huge growth potential for the global embedded device market. The use of embedded devices is becoming more and more common in professional fields, daily life and ordinary work environments. "For example, the various handheld devices ordinary people use, multimedia information kiosks, digital signage in hotels, healthcare tools, point-of-sale (POS) terminals, on-board infotainment devices, and even desktop game consoles are all embedded devices of some kind."

As embedded devices are already very common, a further step could be taken to link them to the increasingly popular IoT and cloud computing ecosystems. The front-end embedded devices perform real-time information gathering and context-aware functions, and through the intermediate IoT connection send the information to back-end cloud computing platforms. This forms a wider IT infrastructure that assists business players in conducting diverse analyses and developing value-added applications.
Li thinks that this not only offers business players a brand new way of collecting and using information, supporting their daily operations and enhancing their long-term planning; it also provides a brand new experience for both general consumers and business clients.

Take digital signage for example. Such embedded devices support split screens to simultaneously provide diverse information on weather, tourism, stocks, advertising, news and so on. With these devices, stores can also access back-end databases to check their inventories, as well as all previous transactions made during particular times each day. All promotional information can be shown on the front-end display, while consumers can access the services offered by the system via user interfaces such as sensors or touchscreens.

Intelligent systems will become the mainstream in the future

Such an operating framework - where connected devices can, through wired or wireless networks, send real-time dynamic consumer information to managers and decision-makers of a business - is an upgrade from conventional embedded systems to intelligent systems that are more helpful to businesses. According to market research firm IDC, shipments of intelligent systems are experiencing staggering growth: their global market has already reached 1.8 billion units with revenues of about US$1 trillion, exceeding the sum total of PCs, servers and mobile phones. By 2015, shipments of intelligent devices will more than double to reach four billion units, with revenues soaring to US$2 trillion.

Li said, "In 2010, intelligent systems only accounted for 19% of all electronics devices, but by 2015 their share will increase to one third." In order to raise prices and profits for OEMs in the wake of this new market trend, Microsoft has made corresponding adjustments to its roadmap for Windows Embedded products.
Microsoft supports cross-platform front/back-end operations for x86, ARM-based and other hardware architectures, in view of the fact that a company may have diverse devices in different departments due to cost considerations and system requirements. Moreover, Microsoft also offers multi-function tools and technology that allow companies to easily manage and analyze data, with which they can develop the customized and differentiated solutions that they need.

As for how intelligent systems work, in general their operation starts with the huge amounts of necessary data (such as speeds, altitudes, temperatures and remaining fuel levels) collected via diverse embedded devices. The data is transmitted in real time via various wired/wireless technologies (such as Wi-Fi, WiMAX, LTE and 3G) to cloud/information centers for integration and analysis. Managers can make decisions and issue instructions based on the data, and these decisions and instructions can be quickly sent to designated devices via the devices' management platforms. The devices can then make adjustments according to the instructions received.

Highly integrated back-end IT infrastructure is needed for developing intelligent systems

Li said, "This means that the operation of complete intelligent systems needs the support of a highly integrated back-end IT infrastructure (covering the Internet, cloud, security, databases, devices, system management platforms and others) to extend general embedded devices all the way to the application systems of the company, allowing control personnel at the company to remotely manage and monitor the data and actual locations of each device. This can guarantee overall security and upgrade systems at the lowest cost possible."
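The collect-analyze-instruct loop described above can be sketched in a few lines. This is a minimal in-process illustration, not Microsoft's implementation; the function names and thresholds are invented, and a real system would move each step across a network and a device-management platform.

```python
# Minimal sketch of the device -> cloud -> device control loop: front-end
# devices gather telemetry, a back-end decides, and instructions flow back.
# All names and thresholds are illustrative assumptions.

def collect(device):
    # Front-end embedded device gathers sensor readings.
    return {"id": device["id"], "fuel_pct": device["fuel_pct"],
            "temp_c": device["temp_c"]}

def decide(reading):
    # Back-end analysis: turn raw telemetry into an instruction.
    if reading["fuel_pct"] < 15:
        return "refuel"
    if reading["temp_c"] > 90:
        return "throttle"
    return "ok"

def apply_instruction(device, instruction):
    # The device adjusts itself according to the instruction received.
    if instruction == "throttle":
        device["temp_c"] -= 10
    return instruction

fleet = [
    {"id": "veh-1", "fuel_pct": 60, "temp_c": 95},
    {"id": "veh-2", "fuel_pct": 10, "temp_c": 70},
]

actions = {d["id"]: apply_instruction(d, decide(collect(d))) for d in fleet}
print(actions)  # {'veh-1': 'throttle', 'veh-2': 'refuel'}
```

In practice the `collect` and `apply_instruction` steps run on the embedded device while `decide` runs in the cloud, with the wired/wireless transport in between.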
He stressed further, "Only by doing so can intelligent systems create bigger benefits and allow faster returns on the investments companies make in installing the systems."

Because of this, Microsoft's Windows Embedded development team is now focused not only on the key areas concerning operating system platforms, but also on improving data processing functions in order to enhance the capability of handling diverse data from embedded devices running on different platforms. Moreover, the development team is also working closely with the teams developing Windows Azure, SQL Server 2008 R2 and other products, to enable the inclusion of cloud computing in this infrastructure and help companies make even more in-depth analyses of the data transmitted back from the embedded devices. The Windows Embedded Device Manager and Microsoft System Center Configuration Manager platforms can help IT personnel more easily monitor and control all intelligent systems deployed around their company's IT network. They can also solve such problems as the unstable connectivity common in IoT environments, and the complex work of identity management and data access authorization.

For ordinary OEMs, Microsoft also offers its powerful development kits and a range of API tools (such as Windows Embedded Handheld, Windows Embedded POSReady and Windows Embedded Automotive) to help companies quickly customize solutions according to the needs of different industries and platforms. With these tools, companies have a variety of ways of using intelligent systems. "For example, the ability to capture a user's full body movements allows users to operate a system through gesture or voice instructions. The Kinect body sensing technology that was originally developed for the Xbox 360 game console was officially included in Microsoft's Embedded series in February this year."
Li noted, adding that developers only need to use Visual Studio to develop distributed applications, completing the entire deployment from devices to the cloud on a single platform.

Makers must unleash creativity to develop intelligent systems from different perspectives

Li stressed that makers of future embedded devices should not focus only on hardware development. They must assume the standpoint of enterprise users and unleash their creativity to introduce total solutions incorporating the concepts and functionality of intelligent systems. This is necessary for improving their products' overall value and enhancing their competitiveness in this highly competitive market. "But apart from the completeness of functions and architecture, makers must also consider such important factors as reliability, manageability, performance and product life cycle when developing solutions."

Li said, "Currently Microsoft and its partners have already transplanted the core business processes and related application systems needed by different businesses - from retail and manufacturing to healthcare - to this platform. For other embedded device makers, this means that they will no longer have to invest in infrastructure development. When there are upgraded versions in the future, the life and usability of their systems will be fully guaranteed." For example, when Microsoft launches Windows 8, systems will be upgraded and enhanced directly. For makers, this means significant help in keeping their products competitive in the market.

Microsoft OEM embedded sales manager Daniel Li
Photo: Digitimes

Li answers questions about Microsoft's Windows Embedded
Photo: Digitimes
GIGABYTE TECHNOLOGY Co. Ltd., a leading manufacturer of motherboards and graphics cards, recently announced that its upcoming 7 series motherboards will boast GIGABYTE's industry-leading Ultra Durable 4 technology, helping to safeguard GIGABYTE motherboards from common everyday threats, including humidity and moisture, electrostatic discharge, sudden power loss and high operating temperatures.

"With the launch of the first Ultra Durable motherboard in 2006, GIGABYTE revolutionized the motherboard industry by making quality the number one focus in our motherboard design," commented Henry Kao, vice president of the GIGABYTE Motherboard Business Unit. "Five and a half years later, GIGABYTE maintains our leadership in quality design by equipping our upcoming 7 series motherboards with even more lifespan-enhancing technologies in the form of Ultra Durable 4."

GIGABYTE Ultra Durable 4

GIGABYTE Ultra Durable 4 motherboards embrace a range of exclusive technologies that guarantee DIY PC builders the best protection for their PC, with built-in features that prevent common malfunction threats users encounter on a day-to-day basis.

2X Copper PCB

GIGABYTE's Ultra Durable 4 design features 2X the amount of copper of a traditional motherboard for both the power and ground layers, which dramatically lowers system temperature by spreading heat more efficiently from critical areas of the motherboard, such as the CPU power zone, throughout the entire PCB. Ultra Durable 4 also lowers the PCB impedance by 50%, which helps to reduce electrical waste and further lowers component temperatures.
A 2X copper layer design also provides improved signal quality and lower EMI (electromagnetic interference), providing better system stability and allowing greater margins for overclocking.

Humidity protection

There is nothing more harmful to the longevity of your PC than moisture, and most parts of the world experience moisture in the air as humidity at some point during the year. GIGABYTE Ultra Durable 4 motherboards have been designed to make sure that humidity is never an issue, incorporating a new Glass Fabric PCB technology that repels moisture caused by humid and damp conditions. Using a new kind of PCB material that reduces the amount of space between the fiber weave, Glass Fabric PCB technology makes it more difficult for moisture to penetrate than traditional motherboard PCBs. This offers much better protection from short circuits and system malfunctions caused by humid and damp conditions.

Electrostatic protection

All GIGABYTE Ultra Durable 4 motherboards use high-quality IC microchips that are rated with higher electrostatic discharge (ESD) resistance than traditional IC implementations - up to 3 times the ESD resistance levels of traditional ICs. This helps to better protect the motherboard, its components and the PC in general against potential damage caused by static electricity, a common threat to today's PCs.

Power failure protection

If your home experiences a sudden power outage for any reason, GIGABYTE Ultra Durable 4 motherboards are equipped to ensure that you won't be dealing with a fatal malfunction.
All GIGABYTE Ultra Durable 4 motherboards use patented DualBIOS technology to provide fail-safe protection for the BIOS on your motherboard, automatically refreshing your BIOS from a backup version in the case of a critical power outage. GIGABYTE Ultra Durable 4 motherboards also feature special anti-surge ICs that protect your motherboard, and your PC, from any surge in power delivery that may occur, helping to ensure that your PC can deal with irregular and inconsistent power delivery.

High temperature protection

GIGABYTE Ultra Durable 4 motherboards feature specially selected components that make your PC capable of operating at higher temperatures in more extreme conditions, while at the same time preventing your PC from overheating. Using all solid capacitors across the GIGABYTE Ultra Durable 4 range reduces overall board temperatures while making the motherboard more robust at higher temperatures. The choice of lower RDS(on) MOSFETs also helps to reduce operating temperatures significantly compared to traditional MOSFET solutions. These component technologies combine to guarantee enhanced longevity and stability for your PC.

To learn more about GIGABYTE 7-series Ultra Durable motherboards, please visit: http://www.gigabyte.com/microsite/306/7-series.html

GIGABYTE Z77X-UD3H
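As a rough sanity check on the 2X copper claim above: for a fixed current, resistive power dissipation follows P = I²R, so halving PCB impedance roughly halves resistive heating in the power planes. The numbers below are illustrative placeholders, not GIGABYTE's measurements.

```python
# Back-of-envelope illustration of why halving plane resistance matters:
# P = I^2 * R. With R in milliohms, the result is in milliwatts.
# The current and resistance values are illustrative, not measured.

def dissipation_mw(current_a, resistance_mohm):
    return current_a ** 2 * resistance_mohm

current = 20                        # amps through the CPU power zone (illustrative)
r_standard_mohm = 2                 # milliohms for a standard copper plane (illustrative)
r_2x_mohm = r_standard_mohm // 2    # doubling copper thickness halves resistance

print(dissipation_mw(current, r_standard_mohm))  # 800 (mW)
print(dissipation_mw(current, r_2x_mohm))        # 400 (mW)
```

The halved dissipation turns directly into less heat that the board must spread away from the CPU power zone, which is the mechanism behind the lower component temperatures claimed above.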
The SSD controller market is rapidly expanding, creating huge opportunities for the semiconductor industry. However, the design of complex chips and ever-increasing performance and functional requirements have also posed a challenge to industry players in terms of cost and efficiency. MIPS proposes running these chips on a multi-threaded CPU architecture to address these design challenges. Looking ahead, the company expects to remain the leading provider of processor architectures and cores for digital home and networking applications, and will actively pursue development in the emerging mobile market.

The SSD market has witnessed an amazing rate of growth thanks to the surging popularity of lightweight and thin multimedia mobile devices, extensive applications of cloud computing and social app platforms, and significantly increasing demand for audio and video data storage. According to data compiled by Forward Insights, SSD shipments will climb to 120 million units in 2014 from only 20 million in 2011. Judging from the recent spree of acquisitions and mergers in the IT industry, a number of enterprises are optimistic that owning SSD technologies will help create competitive advantage. SoC suppliers should seize the opportunity to win a seat in this changing market by providing high value-added solutions, said Del Rodillas, director of vertical marketing at MIPS.

SSDs designed to strike a balance between size, energy consumption and performance

However, producing the best set of SSD controllers to meet these trends is hard.
Developers confront a number of design challenges, including raising the bar for I/O speed (the latest version of SATA Express delivers data transfer rates of 8-16Gbps), and enabling management support for more information and more complex features (such as hybrid storage devices for ultrabooks), as well as error correction, encryption/decryption and security procedures, and data stream priority and control (such as specifying a higher priority for voice traffic than for data). These issues greatly increase the difficulty of designing SSD controllers. How to extend the performance of SSD controllers under limited cost and power budgets is the first problem developers have to overcome, said Rodillas.

Up until now, the CPU has been the most important computing engine for SSD controllers, Rodillas continued. However, running SSD controllers on a conventional single-core CPU architecture has hit bottlenecks: real-time processing performance is unlikely to improve much further, Rodillas said. He explained that SSDs have too many functions that need to operate in parallel (such as NAND layer adaptation, power management, wear leveling, NCQ, bad block management, garbage collection, and traffic management and routing) for a single-core CPU architecture to cope with.
Though some companies alternatively use multi-core CPU designs, this only increases silicon area and power consumption.

Facing intense competition and a more challenging market environment, device manufacturers - particularly those specializing in mobile communications products - are aware that the ability to make products at the lowest possible cost, while meeting specific size, performance and energy-saving requirements, is the key to winning customers.

Therefore, to help the industry solve these problems, MIPS has introduced its multi-threaded processor core, enabling one CPU to process multiple threads at a time. This capability maximizes the utilization of computing resources and improves system performance, Rodillas indicated. The CPU already has a highly efficient single execution pipeline that does not need to be duplicated, and simultaneous multi-threading helps it further improve performance efficiency, Rodillas said.

MIPS multi-threaded CPU cores can help the industry address these design challenges

MIPS' multi-threaded core solution for embedded applications incorporates several hardware virtual processing elements (VPEs) containing thread contexts (TCs), which are used to further increase the number of hardware threads. Through multi-threading, developers no longer need to add the full hardware required to implement another VPE.

In addition, MIPS' multi-threaded core offers an efficient inter-thread communication mechanism for implementing high-performance data flows. Another capability is zero-overhead interrupt support, implemented by letting a thread "park" until an external event signals it to resume execution. These management tools allow users to execute instructions from multiple threads and ease the management of real-time behavior.
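The "park until signaled" behavior described above can be illustrated with ordinary OS threads. This is only a software analogy for MIPS hardware thread contexts: on a multi-threaded MIPS core the parking happens in hardware with zero overhead, whereas an Event here goes through the OS scheduler.

```python
# Software analogy for the zero-overhead interrupt pattern: a thread
# "parks" until an external event signals it to resume execution.
# Python's threading.Event stands in for the hardware signaling mechanism.

import threading

event = threading.Event()
results = []

def worker():
    # The worker parks here, doing no work, until it is signaled.
    event.wait()
    results.append("interrupt handled")

t = threading.Thread(target=worker)
t.start()

# ... the main thread does other work, then an external event arrives ...
event.set()       # signal the parked thread to resume execution
t.join()
print(results)    # ['interrupt handled']
```

The appeal of the hardware version is that the parked thread consumes no execution resources and resumes without an interrupt-handler round trip, which is what makes real-time behavior easier to guarantee.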
These unique features help enhance the quality of service (QoS) of end-market products.

For example, a single-threaded CPU architecture consists of functional units such as the MDU, ALU, LD/ST, ICU and MCU, as well as a translation lookaside buffer (TLB) responsible for virtual-to-physical address translation. Software must access the user and PRA registers to understand the CPU status. With the concept of virtual processors in the MIPS architecture, each virtual processor obtains duplicate user and PRA registers, a TLB and a program counter, while sharing the resources of the execution units.

To software, the solution looks like more than one complete processor, and it can speed up the processing efficiency of the pipeline without increasing area or power. Multi-threading can also be used in Linux-powered devices to develop multi-core programming solutions.

MIPS has used the BrowsingBench benchmark from EEMBC - which provides an industry-accepted method of evaluating web browser performance on smartphones, netbooks, portable gaming devices, navigation devices and IP set-top boxes - to evaluate the performance benefits of its multi-threading core solutions, according to Rodillas. Results show that Android-based web browsing performance is greatly enhanced by the technology. Within a single-core CPU architecture, MIPS multi-threading technology boosts performance by 43% through two VPEs. Performance doubles in a dual-core configuration, and a dual-core, 4-VPE system delivers 2.5X the performance of a single core. An SSD operated in the MIPS dual-core, multi-threaded CPU environment can therefore improve efficiency by 44%, reduce power consumption by 57% and shrink die area by 43% compared to rival products.
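The BrowsingBench scaling figures quoted above can be tabulated to show throughput per hardware thread, taking a single core with one VPE as the 1.0 baseline. The speedups are the article's reported numbers; the per-thread column is simple arithmetic on them, not an independent measurement.

```python
# Normalizing the reported BrowsingBench speedups by hardware thread count
# (baseline 1.0 = one core, one VPE). The declining per-thread figure is
# the usual shape of multi-threaded scaling: each added thread helps, but
# by a diminishing amount.

configs = [
    ("1 core, 1 VPE", 1, 1.00),
    ("1 core, 2 VPEs", 2, 1.43),   # +43% from adding a second VPE
    ("2 cores, 4 VPEs", 4, 2.50),  # 2.5X the single-core performance
]

for name, threads, speedup in configs:
    per_thread = speedup / threads
    print(f"{name}: {speedup:.2f}x total, {per_thread:.3f}x per thread")
```

The second VPE on a single core yields about 0.7x of a full core's throughput, which is consistent with the article's claim that multi-threading adds performance without the area and power cost of a full extra core.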
Systems operated in a single-core, multi-threading environment are superior in terms of performance efficiency (up 55%), energy use (down 57%) and size (49% smaller).

MIPS multi-threaded cores adopted by many companies for their products

According to market watchers, 33% of the products shipping with mid-range to high-end MIPS cores are multi-threaded. For instance, Ralink's ADSL Internet access device is based on MIPS' multi-threaded architecture to enable more applications and ensure high QoS. "With this architecture, Ralink is able to efficiently provide deterministic VoIP response and support for multiple applications in a single device," Ralink said.

Mobileye also uses the MIPS multi-threaded core in the chips inside the company's driver assistance systems - the EyeQ2 vision-based SoC series - for luxury cars such as the Volvo S60 and BMW models. "MIPS' multi-threaded 34K cores helped us achieve a 6x performance increase in the EyeQ2 vision chip over the prior generation," Mobileye said. In addition, the PMC-Sierra PM8013 maxSAS RAID-on-Chip controller comes with three multi-threaded MIPS cores. "The multi-threading MIPS 34K cores deliver performance higher than any other RAID solution," according to PMC-Sierra.

MIPS is a leading global provider of industry-standard processor architectures and cores, providing optimized structures that are cost efficient, meet their applications' scalability needs, and allow system developers to produce their next-generation solutions at lower risk. MIPS' architectures have become a standard in the embedded industry.

According to information provided by MIPS and industry analysts, MIPS provides the number one processor architecture for markets including digital TVs, cable/satellite/IPTV set-top boxes, Blu-ray Disc players, broadband consumer premises equipment (CPE) and Wi-Fi access points/routers.
MIPS-based designs are integrated into tens of millions of products around the world.

Rodillas stated that MIPS' growth strategy targets three key markets - the digital home, wired/wireless networks and mobile communications. MIPS is committed to developing multi-core, multi-threaded and 64-bit technologies for high-end broadband CPE and WLAN applications. The company has also been investing heavily in the development of its ecosystem of partners dedicated to developing connected devices, in order to maintain its existing lead in the digital home and networking segments, Rodillas continued. Meanwhile, MIPS is currently working with a number of Android and 4G device companies, and has been actively developing its presence in the emerging mobile communications field, looking to end the dominance of ARM architectures. Cisco, Hewlett-Packard, Huaya Microelectronics, Linksys, Microchip Technology, Motorola, Pioneer and Sony are among MIPS' important partners.

Del Rodillas, director of vertical marketing at MIPS
Photo: Digitimes

Rodillas demos the latest SSD controller SoCs operated in a multi-threading CPU architecture at the MIPS booth
Photo: Digitimes
Embedded technology and its applications can be seen everywhere, whether in consumer electronics, industrial control, or the promising Internet of Things (IoT) market. Embedded technology and devices are present in all of these markets and are expected to experience tremendous growth in the future. Hence, to help Taiwan-based firms capture the essence of this technology, strengthen the R&D capabilities of Taiwan's IT industry, and secure a meaningful market share, Digitimes held the "Embedded Forum" on February 9, 2012 at the Yong Le Banquet Room (B3) in The Westin Taipei. The forum invited key firms in the supply chain to discuss topics such as industrial control and automation, consumer electronics, and IoT applications of embedded technology. The topics covered also offered participants the opportunity to share their perspectives on developments in embedded technology, the market, and application trends.

AMD has been cooperating with Digitimes on the Embedded Forum for the past six years, said Andy Tseng, corporate vice president and Taiwan general manager of AMD. Tseng noted the technology has shown significant growth throughout the years. Tseng added that the range of applications for a single chipset is limited only by human imagination. This flexibility has created a massive market opportunity for embedded devices, and growth in the market will likely reach 10% every year over the next few years. Tseng stated, "For firms, this is a rare opportunity, especially when the economy of ICT products has been experiencing a downturn."

AMD introduced the AMD Fusion APU series in 2011, a platform combining a logic-processing CPU with a graphics-processing GPU on a single chip. "Demand for high-speed network information transfer and interaction with a graphical interface has been increasing in the embedded platform device environment.
Hence the heterogeneous multi-core integration of CPU and GPU is becoming a standard in the industry," said Tseng. Since its introduction to the market, shipments of AMD Fusion APU products have reached 30 million units in less than a year. This has attracted many firms to follow suit in developing the best methods of integrating a CPU and GPU.

AMD's strategy for the embedded market in 2012 will be to continue developing low-power solutions that satisfy users' need for products that are smaller, more responsive, more dynamic in content, and have longer operating and standby times. AMD also plans to incorporate cloud computing architecture to provide more services to users, according to Tseng. "The growth of embedded devices has been rapid in newly developed countries like China, India and Brazil, hence we will invest more resources in these markets," said Tseng.

In addition to AMD, the forum included participants MIPS, Microsoft, InnoDisk, DMP, VIA Labs, Precision Machinery Research Development Center (PMC), Phoenix, OESF, Tektronix, Advantech, Fluendo and Digitimes Research, who provided experience and advice to players in the embedded market.

Corporate vice president and Taiwan general manager of AMD Andy Tseng
Photo: Digitimes

DTF 2012 Embedded Technology and Application Forum attracts hundreds of participants from the industry
Photo: Digitimes
GIGABYTE TECHNOLOGY Co. Ltd., a leading manufacturer of motherboards and graphics cards, showcased its forthcoming 7 series motherboard designs supporting 3rd generation Intel Core processors at CeBIT 2012 (March 6-10, 2012), highlighting a range of features including the new All Digital Engine, GIGABYTE 3D BIOS and GIGABYTE Ultra Durable 4 technology.

GIGABYTE 7 series motherboards - exclusive All Digital Engine

Visitors to CeBIT 2012 had an exclusive look at motherboards featuring the latest All Digital Engine design for the PWM. The new all-digital power design gives users greater control over the power delivered to their 3rd generation Intel Core processors, which use the LGA 1155 socket. Using entirely digital controllers for the CPU, processor graphics, VTT and system memory, users can enjoy more precise power delivery to the PC's energy-sensitive components than previously possible.

"GIGABYTE 7 series motherboards featuring a new digital PWM design and an updated version of our popular 3D BIOS are on schedule and will be globally available in volume at launch," commented Henry Kao, vice president of the GIGABYTE Motherboard Business Unit. "Our customers are very excited about making a swift transition to these 7 series motherboards and the benefits they offer in terms of features, performance and control."

Dual UEFI with exclusive 3D BIOS

GIGABYTE also offered the chance to see, for the first time in public, the updated look and feel of GIGABYTE's revolutionary 3D BIOS, based on the exclusive GIGABYTE UEFI DualBIOS technology.
3D BIOS offers two distinct modes of interaction in a BIOS environment, 3D Mode and Advanced Mode, which re-draw the traditional BIOS user experience with a far more intuitive and graphical interface.

GIGABYTE Ultra Durable 4 - insist on an Ultra Durable motherboard

GIGABYTE Ultra Durable 4 motherboards embrace a range of exclusive technologies that guarantee DIY PC builders the best protection for their PC, with built-in features that prevent common malfunction threats including humidity and moisture, electrostatic discharge, sudden power loss and high operating temperatures.

Set your desktop free: GIGABYTE Bluetooth 4.0/dual band 300Mbps Wi-Fi PCIe card

GIGABYTE 7 series motherboards come with an exclusive PCIe expansion card that offers support for the latest Bluetooth 4.0 (Smart Ready) and dual band 300Mbps Wi-Fi connectivity*. With the growing availability of affordable or free remote PC operation software, such as Splashtop and VLC Remote, GIGABYTE believes that now is the time to explore and enjoy the home cloud: a personal cloud within the secure environment of a home network where the performance and functionality of desktop PCs can be utilized and controlled by portable cloud devices.

Onboard mSATA support

GIGABYTE 7 series motherboards feature an onboard mSATA connector* that, together with GIGABYTE's EZ Smart Response technology, allows users to simply and cost-effectively enjoy better responsiveness from their PCs. mSATA solid state drives have been made popular by the rapid growth of tablet PCs, and provide a cheaper solution for smart caching because they are available in smaller capacities than traditional SSDs.

Small business solutions

Setting its sights on the small business market, GIGABYTE demonstrated the B75M-D3H motherboard, which represents a new product range with useful PC management features from both Intel and GIGABYTE.
These features allow system integrators to add value to their products by offering services to small businesses with between 1 and 99 PCs, where manageability, affordability and GIGABYTE's Ultra Durable 4 design quality can lead to significant savings in time, cost and effort.

GIGABYTE AMD A75 and 900 series motherboards

GIGABYTE also showcased a number of the latest motherboards designed to get the most from both AMD Fusion APUs and Bulldozer CPUs; these include the 990FXA-UD5, 970-UD3, A75-D3H and A75M-UD2H motherboards.

*Features may vary by model.

GIGABYTE showcased 7 series motherboards at CeBIT 2012
The electricity sector in India supplies the world's fifth-largest energy consumer, a country that accounts for 4.0% of global energy consumption with more than 17% of the global population. To meet rising energy consumption while also tackling electricity wastage, India has identified that between 30% and 45% of electricity is lost during typical transmission and distribution, and that modernizing these systems to be more energy efficient is a key task in securing future economic growth. These losses stem from inefficient metering and the lack of proper energy accounting and auditing, but with smart grid technology, India's power generation industries have a way to overcome these power shortage issues.

Smart Grids - The Way Forward

In order to deliver reliable, economical and sustainable electricity services efficiently, the Government of India has been moving the transmission and distribution of electricity to a new smart grid model. Smart grids attempt to predict and intelligently respond to the behavior and actions of all electric power users connected to them - both suppliers and consumers. Transmission, distribution and electricity usage are reported in real time via smart controls, and a smart grid can respond instantaneously to provide just enough energy as required, as well as being able to monitor, regulate and maintain itself. This smarter process is another example of the intelligent use of technology that brings greater efficiency to the grid and allows consumers to actively participate in optimizing the system, as well as providing information and choice of supply.

Smart Grids in India

For electricity transmission reforms, Powergrid (Power Grid Corporation of India Ltd) needed to keep pace with the increasing complexities of grid operation in a dynamically changing electricity market by continuously upgrading load dispatch centers through Wide Area Monitoring, Adaptive Islanding, Voltage Security Assessments and Dynamic Security Assessments. For electricity
distribution reforms, the Cabinet Committee on Economic Affairs (CCEA) approved a "Re-structured APDRP" (R-APDRP) project scheme. This program included projects for the establishment of baseline data, IT applications for energy accounting/auditing, IT-based consumer service centers, and regular distribution strengthening projects. Four goals were needed for smart grids in India to succeed: 1. The end of load shedding (peak load shifting through a combination of direct control and differential pricing, i.e. demand response/dynamic DSM). 2. Reliable power. 3. Cheaper prices. 4. More sustainable power.

Smart Substation Solution

Technics, New Delhi, an India-based channel partner of Advantech that delivers industrial automation solutions for the power & energy industry, designed a Data Concentrator solution for a Government of India R-APDRP substation project. The project ran in two stages, phase I and phase II. For phase I, Technics used Advantech's ADAM-5550CE Programmable Automation Controllers with ADAM-5017 (analog input), ADAM-5053S and ADAM-5057S (high-density DI/O) I/O modules and Technics' own control software. The Data Concentrator solution needed to be able to transfer input data to a control center and accept commands from the control center to control the switchgear at the substation. To satisfy Technics' requirement, Advantech's PAC division created specially customized solid state memory for their needs. Technics then had to integrate the whole system with the customer's SCADA system.

Mr Abhay Tandon, CEO of Technics, said, "Integration and communication with the end-user's SCADA system was a major challenge. It took our team 800 man-days to overcome design challenges and successfully implement the desired solution, but joint efforts put in by the Advantech and Technics design teams saw us through.
We chose to work on this project with Advantech because of their wide-ranging product portfolio as well as their vertical market domain know-how."

"Power & energy customers also require long-lifetime product support, so Advantech was the natural choice, having been in the automation business for over 25 years. They could supply our customer with the longevity guarantees they needed. Technics' software with Advantech's ADAM-5000 series controllers provided the smart solution our customer needed," said Mr Tandon.

Technics Software and Advantech's Industrial Controllers Improve Power Generation Efficiency
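As an illustration of the Data Concentrator role described above - polling a substation's analog and digital inputs, concentrating them into one report for the control center, and applying switchgear commands sent back - the following minimal Python sketch shows one poll/report/command cycle. Every name in it (read_analog_inputs, SwitchgearCommand, and so on) is hypothetical; the actual Technics control software and the SCADA protocol it speaks are proprietary and not described in the article.

```python
from dataclasses import dataclass

@dataclass
class SwitchgearCommand:
    """A hypothetical control-center command for one breaker."""
    breaker_id: int
    close: bool  # True = close the breaker, False = open (trip) it

def read_analog_inputs():
    """Stand-in for polling an analog-input module (e.g. feeder voltages/currents)."""
    return {"feeder_voltage_kV": 11.02, "feeder_current_A": 184.5}

def read_digital_inputs():
    """Stand-in for polling a digital-input module (e.g. breaker open/closed status)."""
    return {"breaker_1_closed": True, "breaker_2_closed": False}

def build_report(analog, digital):
    """Concentrate both kinds of input into one record for the control center."""
    return {"analog": analog, "digital": digital}

def apply_command(cmd, digital_state):
    """Apply a control-center command to the local switchgear state."""
    digital_state[f"breaker_{cmd.breaker_id}_closed"] = cmd.close
    return digital_state

# One cycle: poll inputs, build the upstream report, then act on a downstream command.
report = build_report(read_analog_inputs(), read_digital_inputs())
state = apply_command(SwitchgearCommand(breaker_id=2, close=True), report["digital"])
```

In a real deployment the two `read_*` stubs would talk to the I/O modules and the report would be pushed over the SCADA link, but the concentrate-then-relay shape of the loop is the point of the sketch.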