NASA describes expected impact of total eclipse on GPS

NASA has issued a statement to let the GPS community know what to expect when the total solar eclipse takes place across America on Aug. 21.

On Aug. 21, the eclipse will cross all of North America. Anyone within the path of totality will see the moon completely cover the sun, revealing the sun's tenuous atmosphere, the corona.

Observers outside this path will still see a partial solar eclipse where the moon covers part of the sun’s disk.


A map of the United States showing the path of totality for the August 21, 2017 total solar eclipse. (Image: NASA)

For NASA, the eclipse provides a unique opportunity to study the sun, Earth, moon and their interaction because of the eclipse’s long path over land and coast to coast. Eleven NASA and NOAA satellites, as well as the International Space Station, more than 50 high-altitude balloons and hundreds of ground-based assets, will take advantage of this rare event over 90 minutes, sharing the science and the beauty of a total solar eclipse with all.

Via live streams and a NASA TV broadcast, NASA will bring the Aug. 21 eclipse live to viewers everywhere in the world.

Below is the statement from NASA regarding GPS.


NASA Note on the Aug. 21 Solar Eclipse and Its Effect on GPS Users

FOR THE GPS COMMUNITY

From an ionospheric point of view, the expected effect of the solar eclipse is a significant reduction in solar EUV ionization (the solar EUV radiation is blocked) and thus in the amount of ionospheric total electron content (TEC), relative to nominal conditions, along the eclipse path.

Some observations also show wave-like TEC perturbations of small magnitude (~1 TECU) during eclipses, as shown in the attached reference. The wave-like perturbations appear to be the effect of atmospheric gravity waves or traveling ionospheric disturbances (TIDs) that might be triggered during an eclipse.

The TEC decrease would reduce the ionosphere-induced delay of GPS signals. The small-magnitude TIDs won't cause any major effects on GPS signals. Neither effect should cause loss of GPS signals.
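The first-order relationship between TEC and signal delay helps explain why: group delay scales as 40.3·TEC/f². The sketch below is illustrative (not part of the NASA note) and shows why a ~1 TECU wave-like perturbation is harmless to GPS:

```python
def iono_delay_m(tec_tecu: float, freq_hz: float = 1575.42e6) -> float:
    """First-order ionospheric group delay in meters: 40.3 * TEC / f**2,
    with TEC in electrons/m^2 and f in Hz (default: GPS L1)."""
    tec_el_per_m2 = tec_tecu * 1e16  # 1 TECU = 1e16 electrons/m^2
    return 40.3 * tec_el_per_m2 / freq_hz**2

# A ~1 TECU perturbation adds only ~0.16 m of delay on L1,
# far too small to threaten signal lock.
print(round(iono_delay_m(1.0), 3))  # → 0.162
```

Because the delay falls off as 1/f², the lower-frequency L2 signal (1227.60 MHz) sees proportionally more delay than L1, which is why dual-frequency receivers can estimate and remove the ionospheric term.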

I have not seen any reports of ionospheric scintillation observations during an eclipse (I might have missed them). It would be interesting to analyze GPS data along the path of the upcoming August eclipse to see if any scintillation events could be triggered.

We have some GPS data processing tools at JPL and can contribute to this analysis.

FOR THE GENERAL PUBLIC

A solar eclipse occurs when the Moon passes between the Sun and the Earth, totally or partly obscuring the image of the Sun for a viewer on Earth. A region of Earth's upper atmosphere, called the ionosphere, affects radio waves, including GPS signals.

The ionosphere consists of “ions,” a shell of electrons and electrically charged atoms and molecules. Because ions are created through sunlight interacting with the atoms and molecules in the very thin upper atmosphere, the density (thickness and consistency) of the ionosphere varies from day to night.

The ionosphere bends radio signals, similar to the way water will bend light signals. That is why you can hear AM radio broadcasts from far away at night. Also, ham radio operators rely on the ionosphere to bounce their signals from their station to the far reaches of the globe.

Since GPS is a radio signal, its measurements are slightly impacted by ionosphere changes, resulting in small increases in position error. For all except very precise GPS users, these changes are negligible.

Note that a total eclipse of the Sun is similar to our day-night cycle, only much faster. So, while the ionosphere will be more dynamic during an eclipse, it will not cause a loss of the GPS signal.

In summary, while any effects from the eclipse are of scientific interest, GPS service should not be adversely affected by the Aug. 21 solar eclipse.

Ionospheric effects should not be confused with those from solar flares (brief eruptions of intense high-energy radiation from the sun's surface), which can cause significant electromagnetic disturbances on Earth, impacting radio-frequency communications (including GPS signals) and power line transmissions. Solar flares are not produced by an eclipse.

NASA has funded 11 studies in a range of heliophysics disciplines; work at MIT Haystack Observatory and Virginia Tech will make extensive use of GPS receivers to study the effects of the total eclipse on the Earth’s ionosphere.

(NASA acknowledges the expertise of Larry Young and Xiaoqing Pi of NASA’s Jet Propulsion Laboratory for content, and AJ Oria of Overlook Systems Technologies for the coordination and editing of these statements.)

http://ABC.es - Wednesday, Aug. 23, 2017

A solar eclipse: the phenomenon that ended a battle and confirmed Einstein

A total solar eclipse will cross the skies of the United States this very afternoon. The solar disk will be partially hidden by the Moon in North America, Europe and South America; in addition, within a narrow strip of land across the U.S., the Moon will completely hide our star. Some 100 million people will be able to watch it live and feel the unsettling sensation firsthand: in broad daylight, the sky will suddenly darken almost as if it were night, the temperature will drop by five or six degrees, and the birds will fall silent.

We now live in the information age and know that natural phenomena have a rational explanation. But in the sixth century B.C., such an event could be terrifying, so much so that it could put an end to a battle.

According to the historian Herodotus, this happened when the Median and Lydian armies, led by kings Cyaxares and Alyattes respectively, were preparing to fight a decisive battle, known today as the Battle of the Eclipse. The fighting is believed to have been about to take place on the banks of the river Halys, now called the Kizilirmak, in the center of the Anatolian peninsula, when the sky unexpectedly "went dark."

Location of the river and the kingdoms in conflict. (Image: Wikipedia)

The phenomenon, supposedly predicted by Thales of Miletus, turned day into night, so the armies decided to halt the fighting and negotiate a truce.

The exact location of the battle is unknown today, but according to astronomers' calculations the events may be linked to an eclipse that occurred on May 28, 585 B.C.

http://ABC.es - Wednesday, Aug. 23, 2017

Autonomy assembled: Driverless kits to hit the road in 2020

A major new global-scale venture by China’s Internet giant Baidu aims to put artificial intelligence behind the wheel of fully autonomous vehicles on the road by 2020.

Regulatory considerations aside, the technical challenges are considerable, but like its U.S. counterpart Google, Baidu is pushing a big pile of chips onto its artificial intelligence (AI) bet.

Similar to Android, Baidu has made much of the Apollo program's code completely open source; it is available on GitHub.

The ecosystem, launched at the Baidu developers conference in Beijing in April, has enlisted at least 50 partners worldwide, with more anticipated.

A key participant is AutonomouStuff, which started out as an autonomous components supplier, but lately self-transformed into a full-fledged system integrator, with core GNSS and inertial capabilities drawn from manufacturers in the positioning, navigation and timing (PNT) industry.

Other Apollo partners include major Chinese auto manufacturers; tier 1 suppliers such as Bosch, Continental Automotive and ZF Friedrichshafen AG; components providers such as NVIDIA and Microsoft Cloud; mapper TomTom; and drive-sharing companies.

AutonomouStuff kitted out two standard Lincoln MKZ sedans for demonstration drives at the Beijing conference, with one technician completing each vehicle in about three hours — a task that would normally take a team of workers up to six weeks. The two Lincolns then drove simultaneously, driverless, around a test track.

The technology has been developed to be transferrable to other vehicles. Models already demonstrated include the Ford Fusion, a street-legal golf-cart-type electric vehicle called the Polaris GEM, and an off-road Ranger buggy platform.


AutonomouStuff presents the Apollo kit at the Baidu developers conference in April. (Photo: AutonomouStuff)

How It Works

Each car is modified by adding lasers, cameras, radar sensors, GPS and an inertial measurement unit (IMU), a drive-by-wire computer interface and a computing engine.

Laser Sensors. A 64-beam lidar sensor on the roof gives a 360-degree field of vision for mapping, and lidar localization algorithms drawing on more than 2.2 million points of data per second generate a point cloud giving distance, angle and intensity values. This data is integrated with data from the GPS and IMU to generate a base map. Two smaller lidar sensors on the front corners of the vehicle provide obstacle detection and tracking.

Rotating four-beam laser sensors with 110-degree view and 200-meter range cover blind spots and facilitate fusing all raw data into one scan. Together, they detect other cars, trucks, bikes, pedestrians and background objects, and generate detailed data on their position, motion and shape. Distance and angular resolution data are used to offset camera and radar data.
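As a sketch of how such distance-and-angle returns become point-cloud coordinates (an illustrative conversion, not Baidu's actual pipeline), each lidar return can be projected into the sensor frame:

```python
import math

def lidar_to_xyz(dist_m: float, azimuth_rad: float, elevation_rad: float):
    """Project one lidar return (range, azimuth, elevation) into Cartesian
    sensor-frame coordinates; a rotating scanner repeats this per beam
    to build up the point cloud."""
    horiz = dist_m * math.cos(elevation_rad)   # range projected onto the ground plane
    return (horiz * math.cos(azimuth_rad),     # x: forward
            horiz * math.sin(azimuth_rad),     # y: left/right
            dist_m * math.sin(elevation_rad))  # z: up

# A 20-m return straight ahead at 0 deg elevation lands at (20, 0, 0)
print(lidar_to_xyz(20.0, 0.0, 0.0))  # → (20.0, 0.0, 0.0)
```

At 2.2 million returns per second, this per-point conversion is the innermost loop of the mapping stage; the resulting points are then transformed by the GPS/IMU pose into a common map frame.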

Cameras. The platform uses two visible-light cameras mounted on the windshield, relying on laser sensors for nighttime operation. An image-processing chip provides real-time detection of lanes, vehicles and pedestrians, and measures dynamic distances from the vehicle.

Radar. Five radar sensors provide object detection, with various placements around the vehicle, and varying ranges and fields of view. Jointly, they provide a 360-degree bubble around the car.

Navigation. The kits provide GPS navigation combined with a tightly coupled IMU to provide data when GPS is not available.

Together, this provides accuracy to 2 cm, according to the company, when used with a real-time kinematic (RTK) base station; this obviously limits the vehicle's operating range to the vicinity of the base station. Another option is to use correction data from satellite-based correction services such as TerraStar, yielding achievable accuracies on the order of 4 cm.

Documentation

The aim of the Apollo project is to enable partners and customers to develop their own self-driving systems. The information supplied by Baidu encompasses a complete set of end-to-end instructions to convert a regular car to an autonomous-driving vehicle:

Software Instructions. A set of files that contain:

  • architecture of the classes and the files within each class.
  • code instructions for:
    • coordinate system
    • third-party libraries
    • calibration table.

Hardware Documents. Instructions to install the hardware and software for the vehicle include:

  • Vehicle:
    • industrial PC (IPC)
    • GPS
    • inertial measurement unit (IMU)
    • controller area network (CAN) card
    • hard drive
    • GPS antenna
    • GPS receiver
  • Software:
    • Ubuntu Linux
    • Apollo Linux kernel
  • Hardware reference guides:
    • vehicle
    • IPC
    • GPS
    • CAN card

Wednesday, Aug. 9, 2017 - http://gpsworld.com

Experts call for a fight against the threat of killer robots


A group of 116 leaders of companies working in artificial intelligence (AI), among them Elon Musk (founder of Tesla and SpaceX), have just signed an open letter in Melbourne, where the 2017 International Conference on Artificial Intelligence is being held these days, calling for a ban on killer robots: autonomous war machines capable of making their own decisions and attacking military targets without human intervention or authorization.

The signatories, from 26 different countries, warn in the letter of an incipient "third technological revolution in warfare" and ask the UN to find "a way to protect us all from these dangers."

This is not the first such appeal. In the summer of 2015, thousands of robotics and AI researchers signed a similar letter, and in December 2016, 123 nations decided at United Nations headquarters to begin a round of formal discussions on the advisability of developing and deploying this type of weaponry. So far, 19 of those nations have formally called for a total ban.

AKMs ("Autonomous Killing Machines") constitute an entirely new category of weaponry, a step well beyond the drones currently used to attack terrorist or military targets in countries such as Pakistan, Afghanistan and other conflict regions.

Different from drones

The main difference is that today's drones are operated by pilots from remote control rooms, far from the site of the attack, which means that with these weapons the decision to release bombs, or not, still rests on a human choice. AKMs, by contrast, need no one to give the green light to a strike, since they can make the decision themselves.

From a certain point of view, AKMs have several advantages over current drone systems. First, it is much cheaper to send a drone to Pakistan than a manned aircraft. And if pilots and remote control rooms were eliminated as well, costs could fall much further.

There is also a political advantage. The United States, for example, lost a great deal during the Vietnam and Afghanistan wars, largely because American voters grew tired of seeing their fellow citizens come home in body bags, maimed for life or mentally scarred, which gave rise to broad pacifist movements and to political and strategic changes. With AKMs, and no American casualties, the decision to take part in a distant conflict would become much easier for leaders.

The "problem" of conscience

And then there is the "problem" of conscience. In recent years, a large number of drone pilots have resigned from their posts, partly because of the heavy workload, but also because they found it unbearable to watch the video footage of victims incinerated and horribly mutilated by their missiles. AKMs, of course, do not have this problem.

In short, the emotional, political and financial advantages could encourage the warlike intentions of many leaders in many countries. The only argument against deploying AKMs would be the ethical one, and we all know that the world abounds in leaders who would have no trouble succumbing to the temptation to use this new class of weapons.

Autonomous Killing Machines are indeed capable of operating without human supervision and have a certain degree of decision-making capability, including the ability to select their own targets. The signatories of the letter say that recent years have brought astonishing technical advances in machine learning systems, which threaten to "permit armed conflicts to be fought in a way never seen before, and on a time scale so fast that humans will not even be able to comprehend it."

Throughout recent history, chemical, biological and nuclear weapons and cluster bombs have all been banned. But each was banned only after it had been used and its terrible consequences verified on the ground. The only case of a weapon prohibited before its deployment came in 1998, when the United Nations banned the use of laser beams to blind enemy soldiers. It is worth noting that although the necessary technology is widely available, the ban has so far been strictly observed.

Could the same happen with AKMs, whose use most people consider immoral?

According to Toby Walsh, professor of AI at the University of New South Wales and one of the promoters of the Melbourne letter, the development of artificial intelligence stands at a crossroads. Although the technology can be used to tackle problems such as inequality, climate change or the economic crisis, it can also be applied to the war industry.

That is why, Walsh says, "We have to make decisions today, and choose which of these futures we want."

http://ABC.es - Wednesday, Aug. 23, 2017

MEMS and wireless options: User localization in cellular phones

Integrations of MEMS sensors with signal conditioning and radio communications form "motes" with extremely low cost and power requirements and a miniaturized form factor. Now standard features in modern mobile devices, MEMS accelerometers and gyros can be combined with absolute positioning technologies, such as GNSS or other wireless technologies, for user localization.

Navigation has been revolutionized by micro-electro-mechanical systems (MEMS) sensor development, offering new capabilities for wireless positioning technologies and their integration into modern smartphones.

These new technologies range from simple IrDA, using infrared light for short-range point-to-point communications; to wireless personal area networks (WPAN) for short-range point-to-multipoint communications, such as Bluetooth and ZigBee; to mid-range multi-hop wireless local area networks (WLAN, also known as wireless fidelity or Wi-Fi); to long-distance cellular phone systems, such as GSM/GPRS and CDMA.

With these technologies, navigation itself has become much broader than just providing a solution to location-based services (LBS) questions, such as "Where am I?" or "How do I get from my starting point to my destination?"

It has moved into new areas such as games, geolocation, mobile mapping, virtual reality, tracking, health monitoring and context awareness.

MEMS sensors are now essential components of modern smartphones and tablets. Miniaturized devices and structures produced with micro-fabrication techniques, their physical dimensions range from less than 1 micrometer (μm, a millionth of a meter) to several millimeters (mm).

The types of MEMS devices vary from relatively simple structures having no moving elements to complex electromechanical systems with multiple moving elements under the control of integrated microelectronics.

Apart from size reduction, MEMS technology offers other benefits such as batch production and cost reduction, power (voltage) reduction, ruggedization and design flexibility, within limits.

Wireless sensor technology allows MEMS sensors to be integrated with signal-conditioning and radio units to form “motes” with extremely low cost, small size and low power requirements.

New miniaturized sensors and actuators based on MEMS are available on the market or in the development stage.

Today’s smartphone sensors can include MEMS-based accelerometers, microphones, gyroscopes, temperature and humidity sensors, light sensors, proximity and touch sensors, image sensors, magnetometers, barometric pressure sensors and capacitive fingerprint sensors, all integrated to wireless sensor nodes.

These sensors were not initially intended for navigation. For instance, accelerometers are used primarily for applications such as switching the display from landscape to portrait as well as gaming.

These embedded sensors, however, are natural candidates for sensing user context. Because of their locating capabilities, people are getting used to the location-enabled life.

MEMS accelerometers and gyros, for instance, can be employed for localization in combination with absolute positioning technologies, such as GNSS or other wireless technologies.

WIRELESS OPTIONS IN SMARTPHONES

Various wireless standards have been established. Among them, the standards for Wi-Fi, IEEE 802.11b and wireless PAN, IEEE 802.15.1 (Bluetooth) and IEEE 802.15.4 (ZigBee) are used more widely for measurement and automation applications.

All these standards use the industrial, scientific and medical (ISM) radio bands, including the sub-GHz bands of 902–928 MHz (U.S.), 868–870 MHz (Europe), 433.05–434.79 MHz (U.S. and Europe) and 314–316 MHz (Japan), and the 2.4000–2.4835 GHz band (accepted worldwide).

In general, a lower frequency allows a longer transmission range and a stronger capability to penetrate through walls and glass.

However, because radio waves at lower frequencies are more easily absorbed by materials such as water and trees, and radio waves at higher frequencies scatter more easily, the effective transmission distance of a signal carried on a high-frequency radio wave is not necessarily shorter than that of a lower-frequency carrier at the same power rating.

The 2.4-GHz band has a wider bandwidth that allows more channels and frequency hopping and permits compact antennas.
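The raw frequency-range tradeoff can be quantified with the free-space (Friis) path-loss formula. This short sketch, with illustrative distances, shows the roughly 9-dB penalty the 2.4-GHz band pays relative to the 868-MHz sub-GHz band over the same link:

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c).
    A higher carrier frequency means more loss over the same distance."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

loss_24 = fspl_db(2.4e9, 10.0)   # ~60 dB over 10 m at 2.4 GHz
loss_868 = fspl_db(868e6, 10.0)  # ~51 dB over 10 m at 868 MHz
print(round(loss_24 - loss_868, 1))  # → 8.8
```

Real indoor links add absorption and scattering on top of this free-space term, which is why, as noted above, the measured range ordering can differ from what frequency alone predicts.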

Wireless Fidelity. Wi-Fi (IEEE 802.11) is a flexible data communication protocol implemented to extend or substitute for a wired local area network, such as Ethernet. 802.11b offers a data rate of 11 Mbit/s and operates in the 2.4-GHz band.

Originally a technology for short-range wireless data communication, it is typically deployed as an ad-hoc network in a hot-spot. Wireless networks are built by attaching an access point (AP) to the edge of a wired network.

Clients communicate with the AP using a wireless network adapter similar to an Ethernet adapter. Beacon frames are transmitted in IEEE 802.11 Wi-Fi for network identification, broadcasting network capabilities, synchronization and other control and management purposes.

Timers of all terminals are synchronized to the AP clock by the timestamp information of the beacon frames. The IEEE 802.11 MAC (Media Access Control) protocol utilizes carrier sensing contention based on energy detection or signal quality.

Received signal strengths (RSS) and MAC addresses of the APs are location-dependent information that can be adopted for positioning. For localization of a mobile device, cell-based solutions, (tri)lateration and location fingerprinting are commonly employed.

Bluetooth. A wireless protocol for short-range communication, Bluetooth (IEEE 802.15.1) uses the 2.4-GHz, 915-MHz and 868-MHz ISM radio bands to communicate at 1 Mbit/s between up to eight devices. It is mainly designed to maximize ad-hoc networking functionality (Wang et al., 2006).

Compared to Wi-Fi, the gross bit rate is lower (1 Mbit/s), and the range is shorter (typically around 10 m). On the other hand, Bluetooth is a "lighter" standard, highly ubiquitous (embedded in most phones) and supports several other networking services in addition to IP. For positioning, either tags (small transceivers) or Bluetooth low energy (BLE) iBeacons are commonly used.

Each tag has a unique ID that can be used for localization. iBeacon is a low-energy protocol developed by Apple; compatible hardware transmitters, typically so-called beacons, broadcast their identifier to nearby portable electronic devices.

The technology enables smartphones, tablets and other devices to perform actions when in close proximity to an iBeacon: the beacon transmits a universally unique identifier that is picked up by a compatible app or operating system.

The identifier and several bytes sent with it can be used to determine the device’s physical location, track customers, or trigger an LBS action on the device such as a check-in on social media or a push notification.

One application is distributing messages at a specific point of interest — for example, a store, a bus stop, a room or a more specific location like a piece of furniture or a vending machine. This is similar to previously used GNSS-based geopush technology, but with a much reduced impact on battery life and much better precision.

Another application is an indoor positioning system, which helps smartphones determine their approximate location or context. With the help of an iBeacon, a smartphone’s software can approximately find its relative location to an iBeacon.

iBeacon differs from some other LBS technologies as the broadcasting device (beacon) is only a one-way transmitter to the receiving smartphone, and necessitates a specific app installed on the device to interact with the beacons.

This ensures that only the installed app (not the iBeacon transmitter) can track users, potentially against their will, as they passively walk around the transmitters. Localization is based on proximity sensing and cell-based solutions.
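A common way to turn a beacon's received signal strength into an approximate range is a log-distance path-loss model. The sketch below is illustrative and assumes the beacon broadcasts its calibrated 1-m reference power (commonly around -59 dBm) and free-space-like propagation (path-loss exponent n = 2):

```python
def rssi_to_distance(rssi_dbm: float,
                     tx_power_dbm: float = -59.0,
                     path_loss_n: float = 2.0) -> float:
    """Log-distance path-loss model: d = 10 ** ((P_1m - RSSI) / (10 * n)).
    tx_power_dbm is the calibrated RSSI measured at 1 m from the beacon;
    path_loss_n is ~2 in free space, higher in cluttered indoor spaces."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_n))

print(rssi_to_distance(-59.0))  # → 1.0 (at the calibration point)
print(rssi_to_distance(-79.0))  # → 10.0 (20 dB weaker under n = 2)
```

In practice RSSI fluctuates strongly indoors, so iBeacon implementations typically report coarse proximity zones (immediate/near/far) rather than a precise distance.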

ZigBee. ZigBee is an IEEE 802.15.4-based specification for a suite of high-level communication protocols used to create personal area networks with small, low-power digital radios.

ZigBee operates in the ISM radio bands: 2.4 GHz in most jurisdictions worldwide, 784 MHz in China, 868 MHz in Europe and 915 MHz in the U.S. and Australia. Data rates vary from 20 kbit/s (868-MHz band) to 250 kbit/s (2.4-GHz band).

It adds network, security and application software and is intended to be simpler and less expensive than other WPANs such as Bluetooth or Wi-Fi.

Owing to its low power consumption and simple networking configuration, ZigBee is best suited for intermittent data transmissions from a sensor or input device.

Applications include wireless light switches, electrical meters with in-home displays, traffic management systems and other consumer and industrial equipment that requires short-range low-rate wireless data transfer.

Distances are limited to 10–100 m line-of-sight, depending on power output and environmental characteristics. ZigBee localization techniques usually use measurement of signal strength (RSS-based positioning) in conjunction with (tri)lateration and fingerprinting.
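A minimal (tri)lateration sketch in Python, with three hypothetical anchor nodes at known coordinates and range estimates (as would be derived from RSS): subtracting the circle equation of the third anchor from the first two linearizes the problem into a 2x2 system solvable by Cramer's rule.

```python
import math

def trilaterate(anchors, ranges):
    """2-D lateration from three anchors with known positions and
    measured ranges. Subtracting anchor 3's circle equation from the
    other two gives a linear 2x2 system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x1 - x3), 2 * (y1 - y3)
    a21, a22 = 2 * (x2 - x3), 2 * (y2 - y3)
    b1 = r3**2 - r1**2 + x1**2 - x3**2 + y1**2 - y3**2
    b2 = r3**2 - r2**2 + x2**2 - x3**2 + y2**2 - y3**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical anchors at three corners of a room; true position (3, 4)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [math.hypot(3.0 - ax, 4.0 - ay) for ax, ay in anchors]
print(trilaterate(anchors, ranges))  # ≈ (3.0, 4.0)
```

With noisy RSS-derived ranges and more than three anchors, the same linearization is typically solved by least squares instead of an exact 2x2 solve.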

COMPARING STANDARDS

Table 1 compares the three wireless standards most suitable for a wireless sensor network. The standards also address the network issues for wireless sensors. Three types of networks (star, hybrid and mesh) have been developed and standardized.

TABLE 1. Comparison of Wi-Fi, Bluetooth and ZigBee.

Bluetooth uses star networks, composed of piconets and scatternets. Each piconet connects one master node with up to seven slave nodes, whereas each scatternet connects multiple piconets, to form an ad-hoc network. ZigBee uses hybrid star networks of multiple master nodes with routing capabilities to connect slave nodes, which have no routing capability.

The most efficient networking technology uses peer-to-peer mesh networks, which allow all the nodes in the network to have routing capability. Mesh networks allow autonomous nodes to self-assemble into the network and allow sensor information to propagate across the network with high reliability and over an extended range.

They also allow time synchronization and low power consumption for the “listeners” in the network, thus extending battery life. When a large number of wireless sensors need to be networked, several levels of networking may be combined.

For example, an IEEE 802.11 (Wi-Fi) mesh network comprised of high-end nodes, such as gateway units, can be overlaid on a ZigBee sensor network to maintain a high level of network performance.

A remote application server (RAS) can also be deployed in the field close to a localized sensor network to manage the network, to collect localized data, to host web-based applications, to remotely access the cellular network via a GSM/GPRS or a CDMA-based modem and, in turn, to access the internet and remote users.

ESTIMATION METHODS

The three most common position estimation methods are cell-based positioning (cell-of-origin, CoO), (tri)lateration and location fingerprinting; they differ in achievable positioning accuracy as well as in their advantages and disadvantages.

They provide different levels of accuracy, ranging from decimeters up to tens of meters. Compared to (tri)lateration and fingerprinting, the principle of operation of CoO is the most straightforward and simplest. Disadvantages include the need for a large number of devices or receivers, as well as degraded performance in dynamic environments.

All these techniques provide absolute localization capabilities. Their disadvantage is that position fixes are lost when no signal coverage is available.

Thus, a combination with other technologies is required to bridge losses of lock on wireless signals (for example, when there is no GNSS reception). Smartphones contain motion sensors that can be employed for inertial navigation (IN); in this article, these sensors are also referred to as inertial sensors.

In the simplest case, a position solution can be obtained from the relative measurements of the inertial sensors via dead reckoning (DR). The accelerometers, for instance, can be used by a pedestrian to count steps while walking and the gyroscope and magnetometer can provide the direction of movement.
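A minimal pedestrian dead-reckoning sketch of that idea, using hypothetical step events (each pairing a heading from the gyro/magnetometer with a stride length inferred from the accelerometer's step detector):

```python
import math

def dead_reckon(start_xy, steps):
    """Advance an (east, north) position by one stride per detected step.
    steps is a list of (heading_rad, stride_m) pairs; heading 0 = north."""
    x, y = start_xy
    for heading, stride in steps:
        x += stride * math.sin(heading)  # east component
        y += stride * math.cos(heading)  # north component
    return x, y

# Hypothetical walk: four 0.7-m steps due north, then four due east
steps = [(0.0, 0.7)] * 4 + [(math.pi / 2, 0.7)] * 4
print(dead_reckon((0.0, 0.0), steps))  # ≈ (2.8, 2.8)
```

Because each step compounds heading and stride errors, such a solution drifts over time and is normally re-anchored whenever an absolute fix (GNSS, Wi-Fi, beacon) becomes available.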

These sensors have therefore gained substantial importance for navigation solutions.

MEMS LOCATION SENSORS

For many navigation applications, improved accuracy and performance is not necessarily the most important issue, but meeting performance at reduced cost and size is.

In particular, small navigation sensor size allows the introduction of guidance, navigation and control into applications previously considered out of reach. In this context, the small size, extreme ruggedness and potential for very low cost and weight of MEMS gyros and accelerometers have enabled, and will continue to enable, applications to use inertial guidance systems that were unthinkable before MEMS.

The reduction in size of the sensing elements, however, creates challenges for attaining good performance. In general, the performance of MEMS inertial measurement units (IMUs) continues to be limited by gyro performance, which is typically around 10 to 30 deg/h, rather than by accelerometer performance, which has demonstrated tens of micro-g or better.

MEMS has struggled to reach high-accuracy tactical-grade quality.

MEMS Accelerometers. MEMS accelerometers are either of the pendulous/displaced-mass type or the resonator type. The former use closed-loop capacitive sensing and electrostatic forcing, while the latter are based on resonant operation.

Both can detect acceleration in two primary ways: either displacement of a hinged or flexure-supported proof mass under acceleration, producing a change in a capacitive or piezoelectric readout, or frequency change of a vibrating element caused by a change in its tension induced by a change of loading from a seismic-proof mass.

Pendulous types can cover a wide performance range, from 1 mg for tactical systems down to 25 μg. Resonant accelerometers, or vibrating beam accelerometers (VBAs), can reach higher performance, down to 1 μg.

MEMS-Based Gyroscopes. For MEMS INS, attaining suitable gyro performance is more difficult to achieve than accelerometer performance. Fundamentally, MEMS gyros fall into four major areas: vibrating beams, vibrating plates, ring resonators and dithered accelerometers.

Gyroscopes are usually built as hybrid solutions, with sensor and electronics as two separate chips. The operational principle for all vibratory gyroscopes is based on the utilization of the Coriolis force.

If a mass is vibrated sinusoidally in a plane, and that plane is rotated at some angular rate Ω, then the Coriolis force causes the mass to vibrate sinusoidally perpendicular to the frame with amplitude proportional to the angular rate Ω.

Measurement of the Coriolis-induced motion provides knowledge of the angular rate Ω. This rate measurement is the underlying principle of all quartz and silicon micro-machined gyroscopes.
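As an illustrative back-of-the-envelope calculation (not from the article), the Coriolis acceleration seen by the proof mass is a_c(t) = 2Ω·v(t), where v(t) is the drive velocity of the sinusoidal vibration:

```python
import math

def coriolis_accel(omega_rad_s: float, drive_amp_m: float,
                   drive_freq_hz: float, t_s: float) -> float:
    """Coriolis acceleration a_c(t) = 2 * Omega * v(t) for a proof mass
    driven as x(t) = A * sin(2*pi*f*t), giving v(t) = A * 2*pi*f * cos(2*pi*f*t).
    The response is proportional to the input rate Omega, which is what
    the sense electrodes measure."""
    w = 2 * math.pi * drive_freq_hz
    velocity = drive_amp_m * w * math.cos(w * t_s)
    return 2 * omega_rad_s * velocity

# Peak response for a 1-deg/s rotation with a 1-um drive amplitude at 10 kHz
omega = math.radians(1.0)  # input rate, rad/s
print(coriolis_accel(omega, 1e-6, 10e3, 0.0))  # ~2.2e-3 m/s^2
```

The tiny size of this signal relative to the drive motion is why the sense output must be synchronously demodulated against the drive reference, as described below.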

These gyroscopes are usually designed as an electronically driven resonator, often fabricated out of a single piece of quartz or silicon. The output is demodulated, amplified and digitized. Their extremely small size, combined with the strength of silicon, makes them ideal for very high-acceleration applications.

For purely surface micro-mechanical gyroscopes, given their small sizes and capacitances, monolithic integration is an option to be considered not so much for cost as for performance.

Combined IMUs. Further interest in all-accelerometer systems, which are also referred to as gyro-free, arises because high-performing small gyroscopes are very difficult to produce. Two approaches are typically used. In the first, the Coriolis effect is utilized.

Typically, three opposing pairs of monolithic MEMS accelerometers are dithered on a vibrating structure (or rotated). This approach allows the detection of the angular rate Ω. In the second, the accelerometers are placed in fixed locations and used to measure angular acceleration.

In both approaches, the accelerometers also measure linear acceleration, enabling a full navigation solution. In the second (direct) approach, however, the need for one additional integration step makes it more vulnerable to bias variations and noise, so the output errors grow an order of magnitude faster over time than with a conventional IMU.

However, these devices only provide tactical-grade performance, and are most useful in GNSS-aided applications. The concept of a navigation-grade all-accelerometer IMU requires accelerometers with accuracies on the order of nano-g’s or better, and with large separation distances.

Use of all-accelerometer navigation for GNSS-unavailable environments will likely require augmentation with other absolute positioning techniques. Further sensor size reductions are underway through the combination of two in-plane (x- and y-axis) and one out-of-plane (z-axis) sensors on one chip. These multi-axis gyroscope and accelerometer chips produce IMUs as small as 0.2 cm3.

Barometric Sensors. Barometric pressure sensors embedded in smartphones and other mobile devices demand small size, low cost and high-accuracy performance. The key element of a pressure sensor is a diaphragm containing piezoresistors which can be formed by ion implantation or in-diffusion.

Applied pressure deflects the diaphragm and thereby changes the resistance of the piezoresistors. By arranging the piezoresistors in a Wheatstone bridge, an output signal voltage can be generated. The measurement sensitivity of the pressure sensor is determined by the strain at the bottom plane of the diaphragm, whereby larger strain leads to higher sensitivity.
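As a minimal sketch of the bridge readout, assuming a full bridge in which two opposite arms increase and two decrease by the same ΔR, the differential output is simply the excitation voltage scaled by the fractional resistance change:

```python
def bridge_output(v_excitation, delta_r_over_r):
    """Full Wheatstone bridge readout (assumed configuration): two arms
    increase and two decrease by delta_R, so the differential output is
    V_out = V_ex * (delta_R / R)."""
    return v_excitation * delta_r_over_r

# e.g. a 0.1% resistance change at 3.3 V excitation gives 3.3 mV
v_out = bridge_output(3.3, 0.001)
```

The linear relationship is what makes the bridge attractive: the output is directly proportional to the diaphragm strain, and common-mode effects such as supply drift largely cancel.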

These altimeters are increasingly used in smartphones and other navigation systems. They can enable altitude determination of the user, for example, to determine the correct floor in a multi-storey building.

Pedestrian Dead Reckoning (PDR). The MEMS accelerometers embedded in the mobile device can be used to estimate the distance traveled from the accelerations made while walking, and magnetometers and gyroscopes to obtain user heading. Starting from a known position, determined by GNSS or other absolute positioning technique, the current position of the user can then be dead-reckoned using observations of the inertial sensors.
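A minimal sketch of the dead-reckoning update described above, assuming a step length and a heading (measured clockwise from north) are already available from the step detector and the magnetometer/gyroscope:

```python
import math

def pdr_step(x, y, step_length_m, heading_deg):
    """One dead-reckoning update: advance the previous position by one
    detected step along the measured heading (clockwise from north)."""
    h = math.radians(heading_deg)
    return x + step_length_m * math.sin(h), y + step_length_m * math.cos(h)

# Start from a known fix (e.g. GNSS), then chain relative updates.
# Assumed step length of 0.7 m; headings: north, then east twice.
x, y = 0.0, 0.0
for heading in (0.0, 90.0, 90.0):
    x, y = pdr_step(x, y, 0.7, heading)
# x ~ 1.4 m east, y ~ 0.7 m north of the starting fix
```

Because each update is relative to the previous one, any error in step length or heading is carried into every subsequent position, which is the cumulative-error behavior discussed below.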

DR techniques differ from other localization techniques because the position is always calculated relative to the previously calculated position, and no comparison with the true position can be made. PDR can give the best available information on position; however, it is subject to significant cumulative errors that compound over time, because both velocity and direction must be accurately known at every instant for position to be determined accurately.

The accuracy of PDR can be increased significantly by adding other, more reliable methods such as GNSS or another absolute positioning technique like Wi-Fi; combining these with the inertial sensors produces more reliable and accurate navigation.

Altitude Determination. For navigation, determination of the altitude of the user can be of great importance, for example in determining the correct floor in a multi-storey building. Barometric pressure sensors can provide this data, augmenting the inertial sensors that can usually only provide reliable 2D localization.

Furthermore, if only three GNSS satellites are visible, providing a 2D positioning solution, pressure sensors can aid 3D localization.

Altitude determination with a barometric pressure sensor can be performed relatively from a given start height — for example, obtained from GNSS outside the building or from a known height point in the indoor environment.

As the user walks inside the building and takes the stairs or an elevator to other floors, height differences can be calculated from the measured pressure changes using a simple relationship between pressure and height.

For conversion of the air pressure change into a height difference, the mean value of the temperature at both stations is also required; MEMS infrared temperature sensors are increasingly found in smartphones to provide this.
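The "simple relationship" referred to above is the hypsometric equation. A sketch, assuming dry air and using the mean temperature of the layer between the two readings:

```python
import math

R_D = 287.05    # specific gas constant of dry air, J/(kg*K)
G = 9.80665     # standard gravity, m/s^2

def height_difference(p_start_hpa, p_end_hpa, t_mean_c):
    """Hypsometric relation: height gained between two pressure
    readings, given the mean temperature of the layer in between."""
    t_k = t_mean_c + 273.15
    return (R_D * t_k / G) * math.log(p_start_hpa / p_end_hpa)

# Climbing roughly one 3-m floor drops the pressure by about 0.35 hPa
dh = height_difference(1013.25, 1012.90, 20.0)   # ~3 m
```

A drop of only ~0.12 hPa per metre is why sensor resolution and temperature compensation matter for reliable floor detection.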

Activity Detection. Low-cost inertial and motion sensors provide a new platform for dynamic activity pattern inference. Human activity recognition aims to recognize the motion of a person from a series of observations of the user’s body and environment.

A single biaxial accelerometer can classify six activities: walking, running, sitting, walking upstairs, walking downstairs and standing.

Until recently, activity detection relied on dedicated sensors worn on the body, and only a few studies had used a smartphone to collect data for activity recognition.

Smartphone accelerometers recognize acceleration in three axes as shown in Figure 1. Different motion sequences can thereby be ascertained.

Figure 1. Smartphone coordinate frame (left) and global horizontal coordinate system (right).

If a smartphone is held horizontally in the hand during a forward motion, then an acceleration in the y-axis is induced. When working with accelerations, two approaches can be applied to measure the linear displacement: integration of the accelerations or step detection combined with step size estimate.

In the first case, the distance traveled can be theoretically calculated by integrating the accelerations once for velocity, twice for distance.

Due to the double integration, however, any error in the signal will propagate rapidly, so the drift on the received signals from the accelerometer makes it impossible to use integration for walks of more than a few seconds.

The Zero Velocity Update (ZUPT) technique, where the velocity is reset to zero between every consecutive step when the foot is stationary for a small amount of time, can overcome this. Any error produced during one step has no influence on following steps. ZUPT can only be used when the accelerometer is placed on the foot, taking advantage of the stationary period between footsteps.
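A toy sketch of the ZUPT idea for a foot-mounted sensor: integrating a constant accelerometer bias builds up velocity error within a step, but resetting the velocity during each detected stance phase stops that error from compounding across steps. All values below are synthetic.

```python
def zupt_velocity(accel, dt, stationary):
    """Integrate acceleration to velocity, resetting the velocity to
    zero whenever the foot is detected stationary (Zero Velocity
    Update)."""
    v, out = 0.0, []
    for a, still in zip(accel, stationary):
        v = 0.0 if still else v + a * dt
        out.append(v)
    return out

# A constant 0.05 m/s^2 bias, sampled at 100 Hz: error accumulates
# within each swing phase but is discarded at every stance phase.
accel = [0.05] * 10
stationary = [False] * 4 + [True] + [False] * 4 + [True]
v = zupt_velocity(accel, 0.01, stationary)   # ends at 0.0 m/s
```

Without the reset, the same bias would integrate into an ever-growing velocity error and, after a second integration, a quadratically growing position error.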

In the latter case, the distance traveled is obtained from step counts by processing the fluctuating vertical accelerations, which cross zero twice with every step. When the number of steps and the step size are acquired, the distance can be calculated by multiplication.
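A minimal threshold-based step counter over the gravity-removed vertical acceleration, counting one step per positive-going crossing; the 0.3 m/s² threshold and 0.7 m step length are illustrative assumptions.

```python
import math

def count_steps(az, threshold=0.3):
    """Count steps from gravity-removed vertical acceleration (m/s^2):
    one step per positive-going crossing above a small threshold,
    which rejects sensor noise around zero."""
    steps, above = 0, False
    for a in az:
        if not above and a > threshold:
            steps += 1
            above = True
        elif above and a < -threshold:
            above = False
    return steps

# Synthetic trace: three oscillation cycles standing in for three steps
trace = [2.0 * math.sin(2 * math.pi * t / 10) for t in range(30)]
n = count_steps(trace)          # 3 steps
distance = n * 0.7              # assumed 0.7 m step length -> 2.1 m
```

Real implementations add low-pass filtering and adaptive thresholds, but the structure, detect oscillation extrema and multiply by a step-length model, is the same.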

Figure 2 shows the recorded acceleration of a walking person in the z-axis, with significant maxima and minima that enable step-counting. Correction for the gravity effect on the x-, y- and z-axes of the smartphone’s local coordinate system is key to the correct determination of accelerometer-derived distance traveled. The MEMS-based three-axis accelerometer allows the device to detect the force applied along the three axes in order to accomplish specific functions based on predefined configurations.

Figure 2. Typical recording of accelerometer sensor data in the z-axis of a walking user.

The mobile device can be oriented such that one of the axes is aligned with the direction of movement or heading (for example, the y-axis), the positive x-axis points rightward and the positive z-axis points upward (compare Figure 1). When the y-axis is horizontal, the gravity effect will be fully reflected on the z-axis.

However, a cell phone will most likely be placed by a user into a pocket or bag. Therefore, most existing step detection algorithms cannot be used directly — adjustments have to be made to take into account the orientation of the accelerometers. Because a phone can be placed with any side up or down, the accelerations are observed to determine which axis is the most vertical one.

The axis that points directly toward the center of the Earth measures a value of 1 g due to gravity. So if the smartphone is lying flat on a table, with the display side up, then the z-axis of the accelerometer would theoretically have a value of 1,000 mg.

If the phone is put crooked (not aligned with one of its axes) in someone's pocket, the values on each axis will be lower than 1,000 mg. To detect which accelerometer axis is the most vertical, the absolute values of the last 30 samples (1.2 seconds) of all three axes are averaged; the axis whose average is closest to 1 g is the most vertical one and the one to use.
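The axis-selection rule just described can be sketched as follows; the 30-sample window corresponds to 1.2 s at 25 Hz, and the sample values below are synthetic:

```python
def most_vertical_axis(ax, ay, az):
    """Pick the axis whose mean absolute acceleration (in mg) over the
    last window of samples is closest to 1 g (1,000 mg)."""
    means = {name: sum(abs(v) for v in vals) / len(vals)
             for name, vals in (("x", ax), ("y", ay), ("z", az))}
    return min(means, key=lambda k: abs(means[k] - 1000.0))

# Phone lying nearly flat: gravity appears almost entirely on z
window = 30
x = [12.0] * window
y = [-8.0] * window
z = [998.0] * window
axis = most_vertical_axis(x, y, z)   # -> "z"
```

The selected axis is then fed to the step-detection stage in place of a fixed z-axis assumption.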

SYSTEM COMPARISON

Table 2 compares the most commonly used location sensors and systems in mobile devices, classified by their positioning capability — absolute or relative — and by their type. A meaningful combination in the form of a hybrid solution will produce the best performance for localizing a mobile smartphone user.


TABLE 2. Specifications of the most commonly used location sensors and systems in mobile devices.

Combining MEMS, Wireless. For the majority of indoor navigation systems, the combination of MEMS sensors and wireless options provides the optimal solution. MEMS sensors can provide relative positioning information, with an unbounded accumulation of location errors over time. Wireless systems provide an absolute position in either a local or global coordinate frame, independent of previous estimates and without integrating measurements over time. The combination of these two technologies takes advantage of the strengths of both, producing a more robust position solution.

CONCLUSIONS

The increasing ubiquity of location-aware devices has pushed the need for robust GNSS-like positioning capabilities in difficult environments.

No single sensor or technique can meet the positioning requirements for the increasing number of safety- and liability-critical mass-market applications.

Integration is one approach to improving performance, but for a significant step change in high-performance positioning in GNSS-difficult environments, higher performance levels are required from both MEMS and wireless technologies.


ALLISON KEALY is a professor of geospatial science at Royal Melbourne Institute of Technology University, Australia. She holds a Ph.D. in GPS and geodesy from the University of Newcastle upon Tyne, UK. She is co-chair of FIG Working Group 5.5 on Ubiquitous Positioning and vice president of the International Association of Geodesy (IAG) Commission 4: Positioning and Applications.

GÜNTHER RETSCHER is associate professor in geodesy and geoinformation at the Vienna University of Technology, with a Ph.D. in applied geodesy. He is co-chair of IAG Sub-Commission 4.1 on Emerging Positioning Technologies and GNSS Augmentation and of the IAG/FIG Working Group on Multi-Sensor Systems.

Wednesday, Aug. 9, 2017 - http://gpsworld.com
