Computer Science, asked by Krishnanunni9206, 5 months ago

ADF enables you to create pipelines that ingest data from disparate data stores.

Answers


Answered by nitin1998mishra

Answer:

Azure Data Factory

Explanation:

Azure Data Factory (ADF) is a cloud data integration service that composes data storage, movement, and processing into automated data pipelines. It can process and transform data by using compute services such as HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning. ADF does not store data itself, and it lacks some of the features that SSIS provides. With the multiple options and configurations available, Azure Data Factory helps its users build ETL pipelines for their enterprise data.
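For concreteness, here is a minimal sketch of defining and running an ingestion pipeline with the azure-mgmt-datafactory Python SDK, following the pattern from the ADF Python quickstart. The subscription, resource group, factory, pipeline, and dataset names are placeholders, the input and output datasets are assumed to already exist in the factory, and the exact model signatures can vary between SDK versions.

```python
# Sketch: publish and run a copy pipeline in Azure Data Factory.
# All resource names below are placeholders for illustration only.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink,
)

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RG_NAME = "my-resource-group"          # placeholder
DF_NAME = "my-data-factory"            # placeholder

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# A copy activity that ingests data from one data store (an input blob
# dataset) into another; both datasets are assumed to exist already.
copy_activity = CopyActivity(
    name="CopyFromBlobToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Publish the pipeline to the factory, then trigger a run.
adf_client.pipelines.create_or_update(
    RG_NAME, DF_NAME, "IngestPipeline",
    PipelineResource(activities=[copy_activity]),
)
run = adf_client.pipelines.create_run(RG_NAME, DF_NAME, "IngestPipeline", parameters={})
print("Pipeline run id:", run.run_id)
```

A copy activity is the simplest kind of ADF activity; a pipeline that also transforms data would add further activities (for example, an HDInsight or Data Flow activity) to the same activities list.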

The three steps of ETL are:

Extract: First, data is extracted from a source location such as a file or database.

Transform: Next, the data is transformed from its source format in order to fit the target location’s schema.

Load: Finally, the transformed data is loaded into a target location such as a data warehouse, where it can be used for analytics and reporting.

The data you need for your analytics workloads may exist in many disparate forms and locations, both internal and external to your organization. For maximum efficiency, this data needs to be stored in a centralized repository, such as a data warehouse. ETL is a crucial part of the data migration process, making it easier and more efficient to integrate many different data sources.
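To make the three steps concrete, here is a small, self-contained sketch in Python using only the standard library. The sales.csv file, its id/name/amount columns, and the in-memory SQLite target are all hypothetical stand-ins; a real pipeline would extract from production sources and load into a data warehouse.

```python
# Sketch: the three ETL steps with stdlib only (hypothetical schema).
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from a source file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: reshape source rows to fit the target schema."""
    return [
        (row["id"], row["name"].strip().title(), float(row["amount"]))
        for row in rows
    ]

def load(records, conn):
    """Load: write the transformed records into the target store."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id TEXT, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in for a data warehouse
    load(transform(extract("sales.csv")), conn)
    print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0], "rows loaded")
```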
