In this blog, you will learn how to use Azure Synapse Analytics to query data that was collected into Azure Log Analytics and exported to Azure Data Lake Storage.

Log Analytics' new "Data export" feature allows you to also send the collected logs to Azure Data Lake Storage. This capability can be used in several scenarios:

First, you may be looking to run machine learning or batch models on top of data collected in a Log Analytics workspace.

Second, you may need to export Log Analytics data to another warm store, for example for auditing purposes; this also allows you to perform data manipulation/filtering in the process.

Last, you might want to retain data externally to the Log Analytics workspace while reducing costs, by converting the data to Parquet format, again performing data manipulation/filtering in the process. Exported data can be retained for very long periods of time relatively cheaply.

In this blog, two features of the Azure Synapse Analytics service will be used. Azure Synapse is a limitless analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Synapse provides different tools and approaches for working with the collected data, including T-SQL scripts, Spark code, and machine learning models. This blog will review a couple of options for analyzing the exported data.
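As a sketch of how the Data export rule itself can be set up, the Azure CLI exposes it under `az monitor log-analytics workspace data-export`. The resource group, workspace, rule name, and table list below are placeholders, and the exact parameter names should be verified against the current CLI reference:

```shell
# Create a data export rule that continuously sends the listed tables
# from a Log Analytics workspace to a storage account (ADLS-capable).
# All names/IDs below are placeholders for your own environment.
az monitor log-analytics workspace data-export create \
  --resource-group my-rg \
  --workspace-name my-law-workspace \
  --name export-to-datalake \
  --tables SecurityEvent Heartbeat \
  --destination "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mydatalake"
```

Once the rule is active, new records for the selected tables land in the storage account as JSON blobs, which is the data Synapse will query below.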