Showing posts from March 13, 2017

The projects IT leaders intend to execute in 2017

The positive dynamic around emerging technologies such as real-time analytics, self-service, metadata management, and artificial intelligence and machine learning, to name a few, is very exciting. IT leaders are increasingly aware of the stakes in terms of costs, productivity, efficiency, flexibility and streamlined ROI. The findings of a survey from Talend of 169 IT decision makers indicate that big data, analytics and governance top the list of priorities in 2017. The findings show real-time analytics, metadata management, and self-service data access make up nearly 70 percent of the projects IT leaders intend to execute in 2017, as they get their data in shape to pursue artificial intelligence (AI) and machine learning in the future.

Awesome curve of the geospatial analytics market

The positive dynamic is real: the global geospatial analytics market was estimated at USD 29.27 billion in 2016 and is projected to reach USD 71.69 billion by the end of 2021, a CAGR of 19.6% from 2016 to 2021. Increasing monetization of geospatial data for decision making across government and business organizations, together with the growing integration and convergence of geospatial analytics with IoT and big data, is expected to fuel the geospatial analytics market.
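As a quick sanity check, the quoted CAGR can be recomputed directly from the 2016 and 2021 figures (a minimal sketch in Python):

```python
# Sanity check: does a 19.6% CAGR connect the two quoted figures?
start, end, years = 29.27, 71.69, 5  # USD billions, 2016 -> 2021

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> ~19.6%

projected = start * (1 + cagr) ** years
print(f"2021 projection: {projected:.2f}B")  # -> 71.69B
```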

Here is how geospatial analysis uses geo-references

One can clearly observe that advancements in social media, smart devices, location sensors, and communications technology have allowed organizations to collect geo-references about every possible event. Geospatial analysis uses these geo-references to build GIS, maps, graphs, cartograms and other useful artifacts that correlate historical shifts with present occurrences and forecast future trends.
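To make this concrete, here is a minimal sketch, using the open-source geopandas library, of how geo-referenced event records become mappable geometries; the column names and sample coordinates are hypothetical:

```python
import pandas as pd
import geopandas as gpd  # pip install geopandas

# Hypothetical geo-referenced event records (e.g. from social media
# posts or location sensors): each row carries a time and coordinates.
events = pd.DataFrame({
    "timestamp": pd.to_datetime(["2017-03-01", "2017-03-02", "2017-03-03"]),
    "event": ["checkin", "checkin", "delivery"],
    "lat": [48.8566, 40.7128, 35.6762],
    "lon": [2.3522, -74.0060, 139.6503],
})

# Turn the plain table into a GeoDataFrame: the geo-references become
# point geometries that GIS tools can map, join, and aggregate.
gdf = gpd.GeoDataFrame(
    events,
    geometry=gpd.points_from_xy(events.lon, events.lat),
    crs="EPSG:4326",  # WGS84 latitude/longitude
)

# From here, events can be joined to regions, plotted on basemaps, or
# aggregated by area to compare historical shifts with current trends.
print(gdf.head())
```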

Here is what has encouraged the development of the intelligent buildings marketplace

Many analysts agree that the promise of revenue growth, improvements in operational efficiency, and meeting corporate sustainability goals are the fundamental drivers that have encouraged the development of the intelligent buildings marketplace. Even better, this momentum continues to grow as intelligent building technology becomes more sophisticated and as shifting customer demands, climate change mitigation goals, power reliability concerns, and budget constraints increase interest. According to a new white paper from @NavigantRSRCH, evolutions in technology, delivery models, and customer demand are building the thrust for the digital transformation of commercial buildings. The rapid deployment of cost-effective data acquisition devices and the integration of IT with traditional building automation and controls are changing the facilities management industry.

Key drivers fueling satellite-based telemetry

Findings converge on the fact that, as the demand for reliable telemetry infrastructure increases, so does the demand for satellite-based telemetry. Satellite telemetry is used for various civil, commercial, government, and military applications. For example, it is used by researchers to track the movement of targets (such as animals and birds) on Earth. Such tracking is done using orbiting satellites: the satellite receives radio signals from transmitters attached to the target. Satellite telemetry is also widely used for military and defense applications.

Here is how the data collected by satellites is used

For those who are unfamiliar, the data collected by satellites is used for diverse applications such as agriculture, management of water resources, urban development, mineral prospecting, environment protection, forestry, border and maritime security, drought and flood forecasting, ocean resources, and disaster management, to name a few.

Prioritize ethical considerations in the creation of autonomous and intelligent technologies

It is very interesting to see that technologists are steadily being encouraged to prioritize ethical considerations in the creation of autonomous and intelligent technologies. In fact, the growth of autonomous and intelligent systems is significantly elevating the importance of accountability, transparency, and ethical considerations in the creation of algorithms. IEEE P7003 will allow algorithm creators to communicate to regulatory authorities and users that the most up-to-date best practices are used in the design, testing and evaluation of algorithms in order to avoid unjustified differential impact on users.

The rapid growth of algorithmic driven services and growing concerns

The rapid growth of algorithm-driven services is very promising, but it has also led to growing concerns among civil society, legislators, industry bodies and academics about potential unintended and undesirable biases within intelligent systems.
Based on this reality, the approval of IEEE P7003, in line with a number of other initiatives and projects, is creating an all-encompassing framework to ensure end users and stakeholders are protected by prioritizing ethics in the development of new technologies.

The approval of IEEE P7003™ Algorithmic Bias Considerations

It is a great pleasure to recall that IEEE P7003™ defines specific methodologies and processes to help certify the elimination of negative bias in the creation of algorithms. In effect, the new standards development project aims to provide individuals and organizations creating algorithms with certification-oriented methodologies that clearly articulate accountability and clarity around how algorithms target, assess and influence the users and stakeholders of autonomous or intelligent systems.
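As a purely illustrative sketch (this is not the IEEE P7003 methodology itself, just one common fairness metric a bias-evaluation process might include), here is a simple disparate-impact check over hypothetical decision data:

```python
import pandas as pd

# Hypothetical algorithmic decisions for two user groups.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

# Approval rate per group, and the "80% rule"-style ratio between them.
rates = decisions.groupby("group")["approved"].mean()
ratio = rates.min() / rates.max()

print(rates)
print(f"Disparate impact ratio: {ratio:.2f}")
# A ratio well below 0.8 would flag the algorithm for closer review.
```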

Search and analyze public datasets, build machine learning models and grow your data science expertise

For those who are unfamiliar, Kaggle, founded in 2010, is home to the world's largest community of data scientists and machine learning enthusiasts. More than 800,000 data experts use Kaggle to explore, analyze and understand the latest updates in machine learning and data analytics. Kaggle is the best place to search and analyze public datasets, build machine learning models and grow your data science expertise.
Now, Kaggle and Google Cloud support machine learning training and deployment services, while offering the community the ability to store and query large datasets.
This momentum lowers the barriers to entry for AI and makes it available to the largest community of developers, users and enterprises, so they can apply it to their own unique needs.
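For a flavor of the kind of workflow Kaggle hosts, here is a minimal sketch that loads a public dataset and trains a model with scikit-learn; the file name and column names are placeholders for whatever dataset you pick:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# "train.csv" and "target" are placeholders: substitute the file and
# label column of the public dataset you downloaded from Kaggle.
df = pd.read_csv("train.csv")
X, y = df.drop(columns=["target"]), df["target"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))
```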

Google Cloud Platform: consistency, availability, and scalability in transactional database applications

Since the announcement of Google Cloud Spanner last month, Google Cloud has steadily aimed to meet the most stringent customer requirements for consistency, availability, and scalability in transactional database applications. Cloud Spanner joins Google Cloud Datastore, Google Cloud Bigtable and Google Cloud SQL to deliver a complete set of databases on which developers can build great applications across a spectrum of use cases. Many third parties have joined the Cloud Spanner ecosystem: Xplenty, iCharts, Looker, MicroStrategy and Zoomdata provide visual data analytics, to name a few.
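For a feel of the developer experience, here is a minimal sketch using the google-cloud-spanner Python client; the instance, database, and table names are placeholders:

```python
from google.cloud import spanner  # pip install google-cloud-spanner

# Placeholders: substitute your own instance and database names.
client = spanner.Client()
instance = client.instance("my-instance")
database = instance.database("my-database")

# Cloud Spanner offers strongly consistent reads: a snapshot gives a
# consistent view of the data at a single point in time.
with database.snapshot() as snapshot:
    results = snapshot.execute_sql(
        "SELECT order_id, total FROM Orders LIMIT 10"  # hypothetical table
    )
    for row in results:
        print(row)
```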

Prepare structured and unstructured data for analysis with clicks, not code

It is a pleasure to recall that Google Cloud Dataprep creates a data pipeline in Google Cloud Dataflow, cleans the data and exports it to BigQuery or other destinations. This means you can now prepare structured and unstructured data for analysis with clicks, not code. You can apply to be part of the private beta here.

Google Cloud Dataprep to dramatically cut the time it takes to prepare data for analysis

It is very interesting to see that Google Cloud Dataprep is a new serverless, browser-based service that can dramatically cut the time it takes to prepare data for analysis. According to Google, it intelligently connects to your data source, identifies data types, flags anomalies and suggests data transformations. Data scientists can then visualize their data schemas until they're happy with the proposed transformations.

Export data from AdWords, DoubleClick and YouTube directly into BigQuery

A warm welcome to the public beta of the BigQuery Data Transfer Service, a great tool to automate data movement from select Google applications directly into BigQuery. In effect, with the BigQuery Data Transfer Service, marketing and business analysts can easily export data from AdWords, DoubleClick and YouTube directly into BigQuery, making it available for immediate analysis and visualization using the extensive set of tools in the BigQuery ecosystem.
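Once a transfer run has landed data in BigQuery, querying it is straightforward. A minimal sketch with the google-cloud-bigquery Python client, where the project, dataset and table names are hypothetical stand-ins for whatever your transfer configuration produces:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

# Hypothetical table created by an AdWords transfer; substitute the
# dataset and table names your own transfer configuration produces.
query = """
    SELECT campaign_id, SUM(clicks) AS clicks
    FROM `my_project.adwords_transfer.campaign_stats`
    GROUP BY campaign_id
    ORDER BY clicks DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.campaign_id, row.clicks)
```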

Google Cloud Functions, an exciting new serverless environment to build and connect cloud services

At the core of the stakes is the ability to build and connect cloud services without having to manage infrastructure. Cloud Functions can be a great way to build lightweight backends and to extend the functionality of existing services. For example, Cloud Functions can respond to file changes in Google Cloud Storage or incoming Google Cloud Pub/Sub messages, perform lightweight data processing/ETL jobs or provide a layer of logic to respond to webhooks emitted by any event on the internet. Developers using Firebase can build backends integrated with the Firebase platform. Cloud Functions for Firebase handles events emitted from the Firebase Realtime Database, Firebase Authentication and Firebase Analytics.
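To illustrate the shape of such a function, here is a minimal sketch of a background function triggered by a Pub/Sub message. It assumes the Python runtime's background-function signature (Node.js was the launch runtime, with Python added later), and the function name is a placeholder:

```python
import base64

def handle_pubsub(event, context):
    """Respond to an incoming Google Cloud Pub/Sub message.

    `event` carries the message payload; `context` carries metadata
    such as the event ID and timestamp.
    """
    # Pub/Sub message data arrives base64-encoded.
    payload = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Received message {context.event_id}: {payload}")
    # Lightweight processing/ETL or webhook-style logic would go here.
```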