Innov-Agri is a large outdoor trade fair for agricultural machinery and equipment. In 2016, it took place in a field near Orléans. Many manufacturers were present and offered live demonstrations of their products. The company I work for was an exhibitor, so I went there on the first day of the fair, walked around, and took a few pictures.
Above, John Deere had a large stand.
Above, a few trains could be used to move around the fair.
Above, a cow licking a hand.
Above, an autonomous robot for weeding row crops, by Ecorobotix.
Above, an Amazone seeder demo.
Above, a drone at the stand of Wanaka (my employer).
Some time ago, I saw this article about using the Qgis2threejs plugin to export a QGIS map as a 3D visualisation in the browser, thanks to three.js and WebGL. When I recently tried to follow the post to reproduce the results, I had some problems sourcing the data (related to Vienna, Austria), so I instead searched for data related to Paris to create a similar scene. Here is the result corresponding to the screenshot above, and another one with aerial photos, both showing the Montmartre area of Paris.
The first step was to create a standard QGIS map with all the necessary data:
- Base map: OSM Mapnik tiles. There are multiple ways to display them in QGIS: either use the OpenLayers plugin, selecting the OpenStreetMap layer, or the TileLayer plugin, first copying this tile description file into the ~/.qgis2/python/plugins/TileLayerPlugin/layers/ directory and then selecting the OSM layer in the plugin.
- Elevation data (DEM): I downloaded the tile that covers most of France from the SRTM Tile Grabber. It provides a TIFF that QGIS displays in shades of gray by default.
- Tree data: a SHP layer with the locations of trees lining the streets of Paris (Arbres d’Alignement), obtained from the French open data portal (data.gouv.fr). I actually added the same layer twice in the QGIS project so that, in Qgis2threejs, I could export one copy as the tree trunks (cylinders) and the other as the leaves (spheres). I kept one of the layers visible, rendered as a circle with some transparency so it would look like the tree's shadow, and hid the other.
- Building data: a SHP layer with the location of buildings in Paris (Volumes bâtis), obtained from the Paris OpenData portal. It only contains the first 50000 buildings of the full dataset. The full data is pretty big, so this will do for my needs. If necessary, the GeoJSON dataset contains all the buildings.
The Qgis2threejs plugin uses what is currently in the QGIS map window as the texture for the 3D terrain. Therefore, only the layers that should appear in the output should be kept visible; this is why some of the layers are unchecked in the QGIS UI shown in the screenshot above.
All the layers, even if not visible in the QGIS map, can be configured to be output in vector form, potentially with some transformation applied, like an extrusion or a sphere. These are the settings that I used:
- DEM: The SRTM elevation data.
- Tree leaves: Green sphere of radius 3m at 5m above the ground.
- Tree trunks: Beige cylinder of radius 0.5m and height 5m at 0m above the ground.
- Buildings: Extrusion of 10m, with random colors.
I also applied some vertical exaggeration in the World settings.
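The per-layer settings above can be restated as a small configuration sketch. Note that the dictionary keys below are purely illustrative, not Qgis2threejs's actual option names:

```python
# Illustrative summary of the Qgis2threejs export settings described above.
# The key names are hypothetical; they simply restate the choices made in
# the plugin's UI.
export_settings = {
    "dem": {
        "source": "SRTM elevation data",
        "vertical_exaggeration": True,  # some exaggeration, set in the World settings
    },
    "tree_leaves": {
        "shape": "sphere", "color": "green",
        "radius_m": 3.0, "altitude_m": 5.0,  # height above the ground
    },
    "tree_trunks": {
        "shape": "cylinder", "color": "beige",
        "radius_m": 0.5, "height_m": 5.0, "altitude_m": 0.0,
    },
    "buildings": {
        "shape": "extrusion", "height_m": 10.0, "color": "random",
    },
}
```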
After launching the conversion (which does not take very long for such a small extent), the browser opens a web page with the exported visualisation.
Above, an alternative version, with aerial photos.
I released a free app called Tokyo Ramen Map a few weeks ago on the Play Store. It is a simple map application showing all the ramen restaurants in central Tokyo, along with ratings. I built it for my own needs, so it is quite bare-bones. The main point was to make it work offline, since I don’t have a data connection outside my apartment.
It uses cartographic data from the OpenStreetMap project, rendered on the device through the Mapsforge library. The positions and ratings of the ramen restaurants come from scraping the RamenDB website. All the code for the app is open source and available on GitHub. It can also be used to generate OpenStreetMap-based apps for any city, with the option of pre-loading points (like the Ramen app) or letting the user add their own. As an example of the latter, I have released apps for Tokyo (Tokyo Offline Map), without the ramen shop layer, and Geneva (Geneva Offline Map).
Android added support for Bluetooth Low Energy (aka Bluetooth Smart), and thus iBeacon (a profile of BLE), in version 4.3. However, on my Google Nexus 10 (2012 edition, now running Android 4.4 aka KitKat), support is disabled in the official build provided by Google, even though the hardware supports it. Since I want to try out (and maybe develop) apps that use iBeacons, that makes me a very sad panda… Thankfully, there is a way to enable it: replace the original Bluetooth-related libraries on the device with patched libraries that add support for Bluetooth Low Energy.
I have created an update ZIP file for the Nexus 10 (aka Manta) and Android 4.4 build KRT16S: The file can be downloaded from here. It was generated by getting the source of Android (branch android-4.4_r1.2), applying this patch by Manuel Naranjo to reenable Bluetooth Low Energy, recompiling (with target aosp_manta-eng), replacing the content of the update ZIP created by XDA Developers user Keine with the newly compiled libraries and, finally, signing the new ZIP.
!!! Although this seems to work fine for me, I am very new to this so use the ZIP and the instructions below at your own risk !!!
To install (I assume the Nexus 10 has never been rooted or unlocked, which was the case for me):
- Obtain the Android SDK (for the fastboot and adb tools)
- Boot to fastboot mode (pressing the Power-Volume Up-Volume Down buttons for some time) and OEM-unlock your device (using “fastboot oem unlock”)
- Boot to the factory recovery mode (pressing the Power-Volume Up buttons to get past the red triangle) and wipe the user data
- Boot to Android and reenable USB debugging
- Boot to fastboot mode and flash a custom recovery image like ClockworkMod (using “fastboot flash recovery recovery-clockwork-220.127.116.11-manta.img”)
- Boot to the custom recovery mode and apply the update ZIP (by choosing the file from /sdcard; the file can be copied to the Nexus 10 using “adb push …”)
- Boot to Android (no need to apply the custom recovery image permanently or root the device)
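The commands from the steps above can be collected into an ordered reference list. A small Python sketch (it only builds the command strings and does not execute anything; the reboots and on-device menu actions between the commands are not shown):

```python
def flash_sequence(recovery_img: str, update_zip: str) -> list[str]:
    """Return the fastboot/adb commands from the steps above, in order.
    Nothing is executed; this is only a compact reference."""
    return [
        "fastboot oem unlock",                      # from fastboot mode (wipes user data)
        f"fastboot flash recovery {recovery_img}",  # flash a custom recovery like ClockworkMod
        f"adb push {update_zip} /sdcard/",          # copy the update ZIP to the device
        # finally, apply the ZIP from the custom recovery menu
    ]
```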
The patch can be tested with an Android app like iBeacon Locate (by the makers of this open-source iBeacon library for Android). I also needed an iBeacon transmitter: I used this open-source Mac OS X application, which runs on my MacBook Air under Mavericks (but an iPhone or dedicated hardware would work too).
Here is the Mac OS X application broadcasting the iBeacon advertisement:
And here is what I get on the Nexus 10:
I was surprised to learn that the Nexus 10 cannot be updated in the usual way (OTA) after this kind of operation. It is possible to roll back to the original libraries of build KRT16S using this ZIP (the files were obtained through “adb pull /system/lib ~/backup” before applying the BLE update ZIP). The new Android build can then be installed (for example via sideloading) and the BLE update ZIP reapplied on top (I have tested this with 4.4.2, aka build KOT49H). After an OTA update and before applying the update ZIP, the libraries will have to be backed up to create a new rollback ZIP for that version. Or, perhaps more simply, the device can be reset to its factory version.
Recently, I have been researching techniques for indoor positioning, where GPS does not work. Although it is not widely deployed yet, one technology that looks quite interesting is called “iBeacon”, an Apple-designed (and currently not officially documented) profile of Bluetooth Low Energy.
An iBeacon uses a Bluetooth signal to broadcast an ID to any listening device (most likely a mobile phone). From the signal strength, the distance to the iBeacon can be estimated, and additional processing could yield more precise positioning. The iOS SDK also supports built-in geofencing and notifications with iBeacons in the Core Location framework. The use of Bluetooth Low Energy keeps power consumption quite low: an iBeacon could last a long time even on a small battery, so only minimal maintenance is required.
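The distance estimate mentioned above is typically derived from a log-distance path-loss model. A rough sketch (the calibrated TX power is the RSSI measured at 1 m, which an iBeacon broadcasts; the path-loss exponent n is an environment-dependent assumption, around 2 in free space):

```python
def estimate_distance(rssi: float, tx_power: float, n: float = 2.0) -> float:
    """Rough distance estimate (meters) from a log-distance path-loss
    model: rssi = tx_power - 10 * n * log10(d).  tx_power is the
    calibrated RSSI at 1 m; n depends on the environment (walls, people),
    so this is only a coarse estimate."""
    return 10 ** ((tx_power - rssi) / (10 * n))
```

At the calibrated power (rssi equal to tx_power) this returns 1 m, and a 20 dB drop with n = 2 corresponds to roughly 10 m.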
As for reading an iBeacon signal from a phone: on the Apple side, it only works on iOS 7 (and the hardware has to support it, although the iPhone 4S and up do). On the Android side, version 4.3 added the necessary Bluetooth LE support to the SDK, and there is already an open-source Android library to interact with iBeacons. On the broadcasting side, an iPhone can serve as an iBeacon for testing purposes, and there are prototypes of dedicated low-power devices (for example, Estimote).
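For reference, the iBeacon advertisement itself is just a fixed 25-byte manufacturer-specific payload. A sketch of parsing it in Python; the layout used here (Apple company ID 0x004C, type 0x02, length 0x15, a 16-byte proximity UUID, major, minor, calibrated TX power) is the commonly reverse-engineered format, since Apple had not officially documented the profile at the time:

```python
import struct
import uuid

def parse_ibeacon(mfr_data: bytes):
    """Parse the 25-byte iBeacon manufacturer-specific data: company ID
    0x004C (little-endian), beacon type 0x02, length 0x15, a 16-byte
    proximity UUID, then major, minor (big-endian) and the calibrated
    TX power (signed byte).  Returns None if it is not an iBeacon frame."""
    if len(mfr_data) != 25:
        return None
    company, btype, blen = struct.unpack_from("<HBB", mfr_data, 0)
    if company != 0x004C or btype != 0x02 or blen != 0x15:
        return None
    proximity_uuid = uuid.UUID(bytes=mfr_data[4:20])
    major, minor, tx_power = struct.unpack_from(">HHb", mfr_data, 20)
    return proximity_uuid, major, minor, tx_power
```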
There are lots of potential applications for indoor positioning, and not just in the marketing/retail sector: for example, in healthcare or home automation. Of course, iBeacon is not the only game in town (Wi-Fi positioning comes to mind), but the fact that it is already being deployed bodes well for its chances of success.