Bob's Adventures in Wireless and Video Headline Animator

Thursday, February 26, 2015

There is Nothing Neutral in Title II Regulation of the Internet

It has been a long time since I posted something to my blog. But Title II regulation of the Internet in the guise of implementing Net Neutrality is such a big issue that I just had to write something.

A Little Background

I started working with FidoNet back in about 1979 in high school and was enamored with CompuServe throughout college. I joined the Well and used my Hayes 1200 baud modem to connect to others to discuss technology, social issues and the like. But I knew that there was going to be a commercial aspect to this that extended beyond just being a substitute for telephone party lines. 
In about 1988 I started a TCP/IP based BBS system in Oakland I called TransTech. Over time I hooked up to NSFNET. I was given a class C network (which I would later have to give back). And finally I found my calling when I built one of the first wireless ISPs in the Bay Area providing coverage off the top of the 1200 Broadway building in Oakland to the newly decommissioned Alameda Naval Air Station.

I never once had to get a license, ask permission, adhere to standards of decency, or worry about what my customers were using their connectivity for. I provided a pipe through which data flowed. I paid upstream providers for larger pipes, and I never had time to worry about metering content. I knew that if I did meter content, there were lots of competitors ready to take my place. Pricing was set by the market. Service was set by the market. Customers chose the best service for the best price.

Fast forward to present. Pretty much the model of operation for the Internet remains the same. 
However, fear of something that has never even been a real issue prompted cries for government intervention in order to enforce Net Neutrality. But people forget that we live in a laissez-faire economic system, in which transactions between private parties are relatively free from government interference such as these massive Title II regulations. Through competition, the market will punish those who filter traffic.

Wireless and over the top/virtual ISPs exist as a locally owned bypass around monopolistic cable and telco companies. This keeps them in check. But not for much longer. We have now empowered the government to regulate from top to bottom a utility which was functioning perfectly well with no regulation. Beware unintended consequences.

Net neutrality and Title II FCC Internet regulation are not the same.

Net neutrality is the concept of treating all packets equally, regardless of source or content. But from a practical perspective this is not realistic. There is limited bandwidth on networks, so traffic needs to be managed and prioritized in order for it all to get through. Every network manager will tell you that they apply QoS (quality of service) prioritization on their network. You have to, unless you have limitless capacity. Network operators should be free to define these rules, and customers should be free to choose which networks they want to use based on the quality of service they receive. Networks will evolve based on the type of QoS prioritization they apply. Customers will have choice in a free market.
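The prioritization idea is easy to see in a toy example. Here is a minimal sketch (not any particular vendor's QoS engine) where packets leave the queue by priority class rather than arrival order:

```python
import heapq

# Toy QoS illustration: lower priority number = more urgent class.
# The sequence number breaks ties so equal-priority packets stay FIFO.
queue = []
for seq, (prio, kind) in enumerate([(2, "bulk"), (0, "voip"), (1, "video")]):
    heapq.heappush(queue, (prio, seq, kind))

order = [heapq.heappop(queue)[2] for _ in range(3)]
print(order)  # ['voip', 'video', 'bulk']
```

Real QoS engines layer bandwidth guarantees, queue depths, and drop policies on top of this ordering, but the principle is the same: when capacity is limited, someone's packets go first.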

Net neutrality can be and is achieved through competition. A free and open market is what is needed. Maybe there could be some work on local rights of way in order to spur competition. But Title II is like putting out a match with a fire hose.

Title II is an antiquated set of government restrictions designed for the bygone telephone monopoly of 1934. Title II sets up regulatory bureaucracies overseeing a utility that wants to be free, not controlled.
The first casualties of this regulation will be the local ISPs who provide the very competition to the large monopolies that the advocates of Title II were trying to rein in. So unintended consequence #1 is the elimination of competition and choice.

The second unintended consequence will be taxation. Now that the government has control of the Internet it will want to wring out of it every last penny it can. Expect fees, licenses, use taxes, excise taxes, and the like to start appearing on your ISP bill shortly. And they will grow.

The third unintended consequence will be increases in the barriers to entry for using the Internet as a vehicle for free expression. With no license fees, no bureaucracy, no regulation, today anyone can set up a blog, a web site, or chat at a cost approaching zero. But the government will make it increasingly harder for people to do this.

The fourth unintended consequence will be censorship. The fairness doctrine, the seven dirty words, hate speech rules, and other arbitrary restrictions will be applied to curtail the use of the "public" (read: government regulated) Internet.

Beyond this, I am sure there will be many other unintended consequences. Curtailment of innovation. Decline in infrastructure investment. Who knows.

Pirate Internet

There is one thing everyone can start planning to do. Buy an outdoor wireless router. Load it up with some mesh software like Freifunk, and create a local community network which is apart from the government-run Internet. Using currently unlicensed radio frequencies, you can bypass the regulation which is coming. You can maintain your freedom. And you can create competition for the new Ma Bell, or whatever we will call the super monopolies this is sure to produce.

Tuesday, December 3, 2013

Carriers Move to Limit Public Safety Cellular Data Transfer

A customer of ours told us (this has not yet been confirmed) that Verizon is moving to follow the T-Mobile model of cellular data transfer throttling for public safety. Basically, you pay a fixed fee for a 5GB transfer cap at full 4G LTE speed (measured in Mbps); once you hit the cap you can still transfer, but your speed is reduced to 256kbps or less. I assume you can still purchase a plan with a higher transfer cap, but it will cost you more.

Actually I use T-Mobile 4G with such a plan for my cell phone and it is a great deal. But then I am only surfing the net, checking my emails and checking in on Facebook.

For police and other public safety personnel who have become accustomed to unlimited transfer at full speed, this presents a big problem. Many of our folks in blue use IP cameras to stream video from a remote site over cellular. It is easy to install, it is available everywhere, and it is compact.

Unfortunately, streaming video requires a lot of bandwidth. A typical 720p camera at 15fps will use about 1.7Mbps. That is about 13MB per minute, or 765MB per hour, so you can easily reach your 5GB transfer limit in about 6 1/2 hours. If you need 24x7 streaming, you have to compromise.

Many IP cameras support multiple streams of the same video, and some have SD cards that allow up to 64GB of storage (83 hours of 720p at 15fps). So you can send one small stream over cellular at a low resolution and slow frame rate while storing locally. If you were willing to run at VGA resolution (640x480) at 4fps, you could get a video stream for "situational awareness" in 150kbps. But that is not very good image quality, it is slow and jerky, and it still burns through your data plan at 1MB per minute, or 1.6GB per day. So you could run for about 3 days a month before you run out of data transfer on your plan.
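The data-cap arithmetic above is easy to verify with a short script (a sketch using decimal megabytes and assuming a constant bitrate):

```python
def stream_usage(bitrate_kbps):
    """Return (MB per minute, GB per day) for a constant-bitrate stream."""
    mb_per_min = bitrate_kbps * 60 / 8 / 1000   # kbit/s -> MB per minute
    gb_per_day = mb_per_min * 60 * 24 / 1000    # MB per minute -> GB per day
    return mb_per_min, gb_per_day

# 720p at 15fps, ~1.7 Mbps: ~13 MB/min, the 5GB cap is gone in ~6.5 hours
mb_min, _ = stream_usage(1700)
print(f"720p: {mb_min:.2f} MB/min, 5GB cap lasts {5000 / (mb_min * 60):.1f} hours")

# VGA substream at ~150 kbps: ~1 MB/min, ~1.6 GB/day, ~3 days per 5GB cap
mb_min, gb_day = stream_usage(150)
print(f"VGA: {mb_min:.2f} MB/min, {gb_day:.2f} GB/day, cap lasts {5 / gb_day:.1f} days")
```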

Another issue with this strategy is that you need to determine the minimum data rate your cellular link can sustain. If you are in an area with only 3G and are on the fringe of the coverage area, or in an area that is highly congested, you may only achieve 100kbps. So you need to set your camera at this low rate in order to assure that your stream gets across. Your camera cannot adapt.

And then you have to think about retrieving your video evidence from your camera. You have your cellular connection, but if you try to transfer the locally stored high quality video from the camera back over that cell connection, you are again faced with your data cap.

And what if you want a 1080p camera, a 5MP camera, a 20MP camera, or a faster frame rate? What will you do?


What you really need is a video storage and transmission system that was designed for the limitations of a cellular world.

  • A system which allows you to use the camera you want, at the location you want, regardless of what kind of power is available or how much space there is.
  • A system which can store weeks or even months' worth of full frame rate, full resolution video on site and yet allow you to easily retrieve it over the air.
  • A system that can accept any USB modem from any carrier.
  • A system which maintains an outbound connection to your HQ location, but does not send video when no one is watching.
  • A system which automatically adjusts its frame rate, resolution, and compression for the best possible live viewing given the bandwidth available.
  • A system which supports remote viewing using any device you have, be it a pc, iPhone or Android device.
  • A system with wide operating temperature and choice of storage media up to 1TB internally.
  • A system that supports cryptographic tunneling for security.
  • A system that gives you a choice of using a simple Windows remote desktop or web interface for management.
  • A system that is backed by a team of experts in IP wireless video.
  • And a system that is reasonably priced.

You want a HauteSpot Networks MVE system.

I could go on about all of the benefits of the MVE system, but the best way to decide for yourself is to try the system out. Call HauteSpot Networks to order a demo system. +1 (800) 541-5589

Thursday, May 16, 2013

The Power Paradox

Three years ago HauteSpot Networks introduced the microNVR which was and remains a valuable tool for video security. The premise of the microNVR is to be a small (as in "micro"), power efficient, computing device that has all sorts of connectivity and the ability to store and process video at the edge. In this case the edge means on a pole, in a car, along a fence, on a roof, out in the woods...anywhere you just cannot place a big computer.

For some time now we have been working towards an upgrade replacement for the microNVR. We need to increase its video processing capability and its storage, while retaining the same general low power consumption and size.

Intel left us without a good roadmap. The Atom processor family has gone essentially without an upgrade since 2009: while Intel focused its resources on building mobile phone platforms, it let the Atom family languish. The Core product line presented no viable options either. Most of those chips started at 25 watts, or about 10x the power consumption of the Atom.

The Z550 Atom that we use in the microNVR runs at 2GHz and is a single core processor with two threads. The GMA500 GPU is capable of encoding about 30fps of 1080p video into H.264 streams. If you have 3 two megapixel cameras running at 10fps each, and are encoding the video or transcoding it for transmission, then you have pretty much consumed all of the capacity of the processor and GPU.
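One way to sanity-check that claim is to express the GPU budget and the camera load in megapixels per second (a rough sketch using the figures quoted above; real encoder throughput varies with codec settings):

```python
# GMA500 budget from the post: ~30fps of 1080p H.264 encode
gpu_budget_mpps = 1920 * 1080 * 30 / 1e6   # ~62 MP/s

# Three 2-megapixel cameras at 10fps each
cameras = [{"megapixels": 2.0, "fps": 10}] * 3
load_mpps = sum(c["megapixels"] * c["fps"] for c in cameras)  # 60 MP/s

print(f"load {load_mpps:.0f} MP/s of ~{gpu_budget_mpps:.0f} MP/s budget")
# -> the three cameras consume nearly the entire encode budget
```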

CPU Boss Shootout review. I have no idea where they got their power number; the Atom is 1/10 the power of the Fusion.
So we went over to AMD and experimented with their Fusion product family, which was presented as having great video processing capabilities at low power. What we found was that AMD had no better story than Intel. Their embedded processors started at about 17 watts, and their GPU was designed, at least in theory, for better performance than the Intel Atom. The AMD T40N processor, which runs at 1GHz, does have a better GPU for decode (like watching a movie), but it only decodes certain codecs well. H.264 encode requires third party libraries (from Main Concept) which did not work when we tried them. We contacted AMD and they said that the GPU did not support H.264 encode, but that later versions of the Fusion family would. That meant any video processing on the platform other than decode had to be done on the CPU in software, which, because the CPU was clocked at only 1GHz, was even worse than the Atom.

So we waited, and waited, and waited. Finally Intel released a low power 3rd generation i3/i5/i7 processor family. The really good news is that these processors have Quick Sync technology built into them as part of the HD Graphics 4000 GPU. Quick Sync accelerates both encode and decode of H.264 (for applications that are designed to use it).

So we were excited to take a look. There is a lot of supporting documentation regarding the performance that Quick Sync brings to H.264. We expect it will mean about 3x the performance of previous generations of Intel Core processors, or about 98fps of 1080p H.264 transcode with almost any member of the 3rd generation i3/i5/i7 (Ivy Bridge) family. However, running at 17W means that our system will consume somewhere around 23W, versus the 10W that we consume today with the Atom system. This much power means twice as many solar panels, twice as many batteries, etc.
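The knock-on cost of those extra watts can be sketched for solar sizing (the sun-hours and derating figures below are hypothetical; actual panel sizing depends on climate and battery chemistry):

```python
def solar_panel_watts(load_w, sun_hours=5.0, derate=0.7):
    """Panel wattage needed to replace one day's energy in the available sun hours."""
    daily_wh = load_w * 24                  # energy consumed per day (Wh)
    return daily_wh / (sun_hours * derate)  # derate covers charging/weather losses

atom = solar_panel_watts(10)   # current Atom-based system
ivy = solar_panel_watts(23)    # low power Ivy Bridge system
print(f"Atom: {atom:.0f}W of panel, Ivy Bridge: {ivy:.0f}W ({ivy / atom:.1f}x)")
```

Whatever the exact derating, the panel and battery cost scales linearly with load, so 23W versus 10W means roughly 2.3x the power plant.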

Then we got some even better news: Intel was going to release a new Atom processor in the second quarter of 2013, and this new Atom (Bay Trail) would have Quick Sync in it and the same HD 4000 GPU. Great, this is the CPU we want...well, don't get too excited...

CPU World reported that Bay Trail was going to be delayed until mid 2014. This would have been a 3W chip with all of the performance we wanted in the exact package size we need. But who knows when or if it will be released.

So we are going to punt. This probably means building a system for release in the next quarter based on the low power i3 or i7 processor. The system will be a little bit bigger and take more power. Then when the new low power Bay Trail comes out, we will jump back to that.

Unless Intel jumps around some more. Keep your fingers crossed.

Sunday, April 14, 2013

Getting through the hype at ISC West

It was interesting to walk the ISC West 2013 show floor and speak with vendors. There were two wild claims being made at this year's show, and misrepresentations of what vendors were showing, that I think integrators need to be aware of.

Video Streaming Over 4G

Example of a matrix view of thumbnails on a VMS
The most outlandish claim came from one VMS vendor who was saying that their product could transmit 16 megapixel camera streams at 30fps and full resolution in 1Mbps of bandwidth over 3G/4G. This was just a flat out misrepresentation of what they were doing.

There is a huge difference between transporting 16 streams at full frame rate and full resolution and sending thumbnail views of camera streams. Let me explain.

A typical H.264 IP camera with a 1.3 megapixel resolution will compress and stream video at a frame size of roughly 25.6KB. At 30fps, that is 768KBps, or 6144Kbps (6.1Mbps). This assumes some motion, etc., but is a general ballpark. By turning up compression, you can get this down to around 2.5Mbps at the cost of image quality (artifacts).
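Those numbers check out with simple arithmetic (a quick sketch; 1KB is taken as 1000 bytes to match the Kbps figures):

```python
frame_kb = 25.6   # compressed frame size from the example, KB
fps = 30

kbytes_per_sec = frame_kb * fps   # 768 KBps
kbps = kbytes_per_sec * 8         # 6144 Kbps
print(f"{kbytes_per_sec:.0f} KBps = {kbps:.0f} Kbps = {kbps / 1000:.1f} Mbps")
```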

In a typical security configuration the cameras send a primary stream over a high speed local area network to the VMS server without concern for bandwidth, recording at best quality and frame rate. Sending a 2.5 Mbps, or even a 6Mbps stream, between the camera and the server is not usually an issue. However, sending from the server out to a remote client such as a smartphone over a narrow cellular link can be an issue.

Example of how HauteSpot MVE Connects
If you are using an LTE phone you may be lucky and get 8 Mbps or more down, depending on network conditions, but rarely is this sustainable over long periods. A more likely downstream speed is going to be around 1 to 2 Mbps.

If you tried to stream even one H.264 1.3 megapixel stream at full resolution, full frame rate and full image quality, the video would have little chance of getting through in real time. You would need to buffer for a long time (think YouTube, Hulu, etc). Remember that most broadcast streaming is non-realtime, and uses multipass encoding techniques to improve the quality and reduce the size of the streams.

Further, a 1080p monitor can display 1920x1080 pixels or approximately 2 megapixels at any given time. So sending any more data than this from the server to your client is really useless.

So what most VMS systems do is configure a second, lower resolution stream from the camera for live viewing, while still recording the primary stream at full resolution. When you connect to the VMS from your cell phone, you first get a thumbnail of each camera built from the small substream; when you click on a camera to look at it in detail, the VMS shifts from the small secondary stream to the primary full resolution stream. But even this may be scaled and transcoded on the server to fit the size of your device's screen.

So, really, what you are getting from the VMS to the smartphone client is a 2 megapixel or smaller transcoded rendering of your cameras. When you are looking at 16 cameras, you are not receiving all 16 streams at full resolution; you are receiving either a bunch of small substreams or a single transcoded stream representing all of the substreams in 2 megapixels. This may then be additionally compressed to fit into a 1Mbps pipe. Think of it as an "image of your streams".

There are many VMS products that do this, and the technique has existed for years. It is a commonly known method that is well documented in the public domain.

What is missing from most vendors' implementations are several critical elements which are addressed by the HauteSpot MVE system:
  • Inbound Mobile Ingest of Cameras - It is all well and good to be able to view your cameras remotely over 4G, but connecting your cameras to the VMS over cellular is a different story. This requires uplink speed, which rarely exceeds 500kbps and can be extremely variable. MVE dynamically adjusts the uplink speed to fit the available bandwidth. It is not preset to a low "live view" rate and size. If you have good bandwidth, you will get a better image.
  • Persistence - Most VMS have no ability to deal with the constant disconnects of cellular. Roaming between cell towers, changing IP addresses, and lost signal all contribute to loss of connection. MVE makes and keeps persistent connections that have been proven to work on cellular networks in the worst conditions (Hurricane Irene, Tropical Storm Sandy, floods and hot weather).
  • Remote Record with Transfer - The HauteSpot microNVR allows remote recording at full frame rate and resolution at the camera source. It then provides a persistent connection to the MVE server and the dynamic rate adaptation which adjusts to changing available bandwidth. It also allows remote, chain of evidence transfer of the high resolution video from the remote site in batch.
There is a big difference between transmitting full resolution, full frame rate, full quality video over cellular and sending a transcoded or down-res version of video from megapixel sources. Know what you are getting.

VMS IP Video Decode Capacity Vs Record Capacity

As we walked the show, we asked a simple question of a number of VMS vendors: "How many 1080p 30fps live streams can you simultaneously display (assuming a multiple monitor video card on the system)?" The answers ranged from 4 to 64. And the answers were coming from field application engineers, not sales people.

A VMS provides a number of functions, including recording video streams to disk, displaying live and recorded streams, providing search capabilities, etc. The answers that we got indicated that the vendors really did not understand the difference between storing video and decoding video for display.

Again, IP VMS systems receive incoming video streams from cameras. These are typically either H.264 or MJPEG streams that have a fixed frame rate, resolution and compression profile as set on the camera. Most VMS systems will receive the video stream and write it to disk. This function does not require any significant CPU or GPU functions, since it is really just network IO. The system is limited only by the bandwidth of the network as to how many streams it can support. Disk IO is generally faster than network IO.

Tom's Hardware Benchmark for H.264 encoding
Displaying a live or recorded stream does require CPU and/or GPU resources. A stream must be read and decoded for display. This is significant work for a computer.

The chart on the left shows just how much time it takes to encode 1080i video using hardware and software. This is an example of when you are reading a raw source and sending it to a client to view. Performance is similar for decoding. As you can see, using hardware such as the Intel Quick Sync technology built into Ivy Bridge based systems can significantly improve performance. Without Quick Sync, software encoding/decoding consumes approximately 25-50% of a quad core CPU for just one 1080p 30fps stream. Using high end video cards helps some, but again, they are targeted at single or maybe dual display systems, so they really don't try to decode more than what they are capable of displaying.

If you were to use the hardware acceleration of Quick Sync, then you could get 120fps of 1080p video encoded or decoded for playback. So the best possible case for displaying full 1080p at 30fps is 4 simultaneous streams. But this is using hardware. If you are using software to decode/encode, as most VMS systems do, then you are really down to about 1 stream. Which makes sense given that most VMS systems only have one display attached.
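The stream-count ceiling falls out of a one-line division (a sketch using the throughput figures above; real decoders also lose capacity to scaling and rendering):

```python
def max_live_streams(decode_fps_budget, stream_fps=30):
    """How many full-rate streams fit in a decoder's total fps budget."""
    return decode_fps_budget // stream_fps

print(max_live_streams(120))  # Quick Sync hardware: 4 streams of 1080p30
print(max_live_streams(30))   # software decode at ~one stream's worth: 1
```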

So the vendors who said they were able to simultaneously decode 12 or even 64 streams are clearly not correct.

If you are writing a file to disk, or reading a file from disk to the network, then yes, you can support many cameras. If you are trying to actually display the video, then the capacity is much, much lower.

It is interesting that only a couple of products like Network Optix HD Witness or HauteSpot's MVE system take advantage of Intel Quick Sync hardware acceleration in order to increase performance and are designed for high definition video processing.

This is actually a good thing for companies like RGB Spectrum who make display wall processors that allow you to aggregate multiple server video outputs into a single consolidated display. Solutions like their QuadView HDx allow you to consolidate multiple video sources into one screen, so even if you can only get a single 1080p stream out of your VMS, you can combine it with other servers to create faster, full frame rate, full resolution systems. Watch for more on high resolution display, particularly 4K, when I review NAB 2013 in my next blog.

Sunday, April 7, 2013

Last Chance for Free Exhibits Pass to ISC West 2013

It is that time of year again...ISC West 2013 in Las Vegas. If you have not already done so, there is still time to get your exhibits only pass. Visit our free exhibits only pass page by April 9th. Then come by our booth 9139 to see some of our cool new products including:

HauteMOBiLE - In-vehicle high performance multi technology multi function router 


HauteMESH - High availability wireless routing


Integrated 8MP 180 degree camera - NVR - wireless router 

HauteSpot and Sentry360 have collaborated to create a new, completely integrated remote surveillance solution which combines an 8MP 180 degree camera, the microNVR, and the HauteMOBiLE router into a single, easy to install, low power, rugged system. The new system allows situational awareness video surveillance to be placed anywhere and powered by just about anything (solar, wind, battery, AC). Video can be viewed live, at high frame rate and high resolution, over HauteSpot private broadband or over 3G or 4G cellular. The system scales to hundreds of cameras. HauteSpot will be presenting this new product at the Sentry360 Partner Conference on April 9th at the Flamingo Hotel.

Collaborate and WIN

Come by the HauteSpot booth 9139, meet the HauteSpot team, and discuss your upcoming projects.

We are also running a show special through the end of April on the HauteMOBiLE router. Only $299. See our booth for details and applicable limitations.

We are also raffling off products. Come by and say hi.

Friday, February 8, 2013

Standard vs Embedded OS for Video Server

Video Management Servers are interesting beasts. On the one hand security system installers, integrators and end users want the flexibility of running a standard operating system where they can install the applications they want, optimize the system the way that they want and use it in a way that is well known to them.

On the other hand they also want to lock the system down so that it cannot be easily broken/trashed/corrupted/hacked etc.

Building an embedded system is generally what people do to lock a system down to serve one purpose. We sometimes call this building an appliance. Basically an embedded system will use a purpose built piece of hardware and special purpose operating system, install their application, and then freeze the system so it cannot be modified.

If you are a system developer, this is a fairly straightforward endeavor. You assign a programmer to build the embedded operating system and use programming tools to lock it down. Then you sell your appliance as a single function device. Done. Linux is well suited for this, and you can easily build systems with read only boot/system partitions and working directory structures to store system data. Microsoft offers a product line for this referred to as Windows Embedded. It is a modification of standard Windows which allows modular configuration of the operating system, customization of the interface, and locking the system down.

The problem that embedded systems pose for end users and system installers is that they limit flexibility. Once a system is built into an embedded appliance, its purpose is set. You can't easily modify it, add functions to it, or customize it. From a support perspective this is good. Support technicians know exactly what the system does, how it does it, how to reset it, etc. So when a customer calls in, you can actually help them. From a user perspective it is frustrating, as you may need to add functionality that the embedded system developer never intended or imagined.

The other main thing that embedded systems do is create read only boot/system images that allow devices to be powered off without going through a shutdown sequence. Generally the OS and applications are loaded into and run from RAM on boot up and read/write to a RAM disk. This eliminates disk writes which can corrupt storage if they are interrupted. A standard operating system, on the other hand, will read and write to the physical disk, freeing up RAM to be used for other purposes. There are a lot of things that get written to disk, including page files, log files, temp files, the registry, configuration files, etc.

When we designed the microNVR we weighed the pros and cons of using an embedded operating system versus using a standard operating system. We opted for supporting Windows 7 Professional which is a standard operating system (we also support our own distribution of Linux which is based on the base Kernel with an LXDE desktop). This allows our customers to install and run any standard applications they wish. For example we support Exacq, Milestone, ONSSI, Genetec, Network Optix, and many other video management systems. We also support our own MVE and Mobile Video Vault systems.

However, we have had some issues with customers not executing safe system shutdowns on Windows, which has resulted in disk corruption. Normally on boot up the user will be asked to run a disk check and correct the issue, but the microNVR is often used headless (without a monitor) so the user cannot see that the system wants him/her to hit a key to run a repair. We had to make a change.

We have just finished building a new system image which will be available on new microNVR systems. The new image still uses Windows 7 Professional, but with several significant changes which make the system straddle the line between embedded and standard operating systems. But the change also fixes a number of deficiencies we saw in Windows 7 Professional when used as a host for a Video Management System.

First we changed the definition of how we used our disk partitions. We now have a system partition (C:) and a data partition (D:). Drive C: loads normally to memory, but we also create a RAM drive which intercepts write activity and buffers it while the system is powered on, making Drive C: essentially read only. Drive D: is a full read/write partition which hosts all of your applications, as well as user data files, log files, page files and more. Anything that needs to be persistently written to disk goes on Drive D:.

When a user wants to modify the registry, or other system configuration which needs to be written persistently to Drive C: for booting, then we have an application on the system desktop that will write the working RAM drive intercept partition to Drive C: making the changes permanent.

So basically we created a hybrid system that protects the system from unintended changes, preserves data, and allows permanent changes to the system without having to use an embedded operating system. We reduce system failures due to disk errors and allow the microNVR to behave like a true embedded device, while allowing all of the flexibility of a standard server OS. Think of this as an end user oriented embedded system.

The only thing a user has to do differently with our new Windows 7 Professional build is click the "Write to Disk C:" icon on their desktop after they have installed or removed an application, or made other system changes which require modifications to the registry or the system disk partition. Of course, the whole system can be remotely administered using Windows Remote Desktop over Ethernet or 802.11b/g/n wireless connections.

The new hybrid OS is an exciting development by HauteSpot which gives our installers, system integrators, and end users all the benefits of an embedded OS, with the flexibility of a standard OS.

Friday, January 25, 2013

HauteSpot and GER Support the 57th Presidential Inauguration

900,000 People Fill the National Mall for the 57th Presidential Inauguration

DC DOH Mobile Command Vehicle with Ewrap on right
The 2013 Presidential Inauguration brought landmark changes in emergency management and spectator safety.  For the first time, inaugural personnel used a powerful situational awareness software suite and wireless communications system to track medical emergencies; reunite lost family members; and provide real time information to event organizers.  Emergency personnel from The District of Columbia, Maryland, Virginia, and the United States military integrated emergency data using HC Standard® – a patient tracking and critical asset software solution developed by Global Emergency Resources, LLC based in Augusta, Georgia. The mobile systems and devices communicated over various methods, including the Ewrap (Emergency Wireless Routing Access Point) system developed by HauteSpot Networks for Global Emergency Resources.

One of several first aid tents erected during the Inauguration
HC Standard® and the Ewrap system allowed local, state and federal agencies, including the National Parks Service, Secret Service, the Red Cross, and Homeland Security officials to have a common operating picture of major events during the Inauguration, including the Presidential Candlelight Reception; the Inaugural Parade; activities along the National Mall; the Commander in Chief Ball; the Inaugural Ball; and the Inaugural Prayer Service.

Staff from JTF CAPMED prepare for patient support
The DC Department of Health partnered with the Maryland Institute for Emergency Medical Service Systems (MIEMSS), the Northern Virginia Emergency Response System (NVERS), and the Maryland Department of Human Resources (MD DHS) to provide patient care and tracking throughout the event.  Each partner used its own installation of HC Standard® to enter patient data with Motorola MC65 handheld devices. The MC65 handhelds then linked over either cellular or standard 802.11b/g/n "WiFi" through one of 18 Ewrap communications systems strategically placed at tents around the National Mall back to command centers in real time.

MC65 handheld with HC Standard® running
The Ewrap system is a waterproof, rugged, portable communication system which combines advanced wireless mesh, multi band broadband connectivity, high speed wired Ethernet, and multi technology/multi band cellular (LTE, EVDO, UMTS, HSPA+, and GSM EDGE) wireless backhaul to maintain constant, always-on communications, along with a long range 802.11b/g/n access point function for client device connections. The electronics are based on HauteSpot's HauteMESH router product, but enhanced and integrated into the case, antenna, and power system by GER. Each Ewrap is fully self contained, with batteries capable of running for several days, yet the system is lightweight and easy to carry by hand. The Ewrap is intelligent and automatically selects the best backhaul connection.

Ewrap systems ready for use
Around the Mall, the Ewrap units would communicate with each other over mesh. Each Ewrap would then connect over the carrier cellular network, announcing its cellular status over mesh to all the other Ewraps. If an Ewrap could not connect over its local cellular connection, it could use the connection of another Ewrap on the mesh.
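The failover behavior described above can be sketched in a few lines (a hypothetical simplification; the real Ewrap does this at the routing layer, and the peer names here are made up):

```python
def pick_uplink(own_cell_up, peer_status):
    """Choose a backhaul: local cellular first, then any mesh peer whose
    advertised cellular status is up.

    peer_status: mapping of peer name -> cellular status heard over the mesh.
    """
    if own_cell_up:
        return "local-cellular"
    for name, up in peer_status.items():
        if up:
            return f"mesh-via-{name}"
    return "no-uplink"

print(pick_uplink(False, {"ewrap-07": False, "ewrap-12": True}))  # mesh-via-ewrap-12
```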

Patient and incident management data was aggregated and shared across all the different agencies' systems so that EMTs, first responders, and command center leaders could see the full picture of Inaugural events as they occurred.

JTF CapMed Operations Team
on the day of the 57th Inauguration.
During the Inauguration, volunteers from the various EMS agencies used the Motorola handheld devices running HC Standard® to track emergency and first aid cases, then transmitted the data back in real time to each of the three emergency operations centers, where it was plotted, displayed, and used for event tracking and management.  Additionally, family members who were lost, and those who were looking for them, had their information uploaded in real time to a multi-jurisdictional database so they could be more easily reunited.  Even the 100+ horses that carried the mounted police were part of the HC Standard® operating picture.

“Interoperability was key,” says Stan Kuzia, CEO and founder of Global Emergency Resources.  “The EMS and Healthcare partners in the National Capital Region (NCR) have worked diligently over the years to eliminate information silos and enhance communication. This Presidential Inauguration demonstrated their hard work is paying off.”  The various civilian agencies in the NCR worked closely with their military counterparts to share a combined picture of patients and missing persons being treated and handled during the entire event.  HC Standard® and the Ewrap communications system helped to bridge the interoperability gaps on Inauguration Day, as near real-time data was made available to military responders just as quickly as it was to their civilian counterparts.