Making life easier with Hive Tables

Have you ever tried finding all of the Delta tables within all of the databases in the hive catalog? It takes some serious Python and dataframe coding to accomplish this task. The high-level algorithm is below.

1 - Grab a list of all databases.

show databases;

2 - Grab a list of all tables in a given database, such as dim.

show tables from dim;

3 - For each table, grab the detailed table information.

describe table extended dim.employee;

Obviously, there are some lists (dataframes) to traverse to obtain this information. A rough sketch of that loop is shown below; I will be putting the complete solution on my blog in the future.
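This is only a minimal sketch, assuming a classic hive metastore and an active SparkSession named spark (as in a Databricks notebook). The column positions in the show and describe output are illustrative and can vary between Spark versions.

from pyspark.sql.utils import AnalysisException

delta_tables = []

# 1 - grab a list of all databases in the hive catalog
for db_row in spark.sql("show databases").collect():
    db = db_row[0]

    # 2 - grab a list of all tables in the current database
    for tbl_row in spark.sql(f"show tables from {db}").collect():
        tbl = tbl_row[1]  # columns: database, tableName, isTemporary

        # 3 - describe the table and keep it only if the provider is delta
        try:
            detail = spark.sql(f"describe table extended {db}.{tbl}").collect()
            provider = [row[1] for row in detail if row[0] == "Provider"]
            if provider and provider[0].lower() == "delta":
                delta_tables.append(f"{db}.{tbl}")
        except AnalysisException:
            pass  # skip anything that cannot be described (e.g. a stale table reference)

print(delta_tables)

Is there a better way to solve this problem?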

Databricks has implemented the information schema in the release of Unity Catalog. As you can see, it went GA yesterday. Please check your Azure region for availability.

If you are using Unity Catalog, this complex task boils down to the following Spark SQL query.

select * from information_schema.tables where data_source_format = 'DELTA'
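If you prefer PySpark, the same lookup can be wrapped in spark.sql. This is just a minimal sketch; it assumes a Unity Catalog enabled workspace and uses system.information_schema.tables, which spans every catalog in the metastore (each individual catalog also exposes its own information_schema).

# find every delta table registered across all catalogs
delta_tables = spark.sql("""
    select table_catalog, table_schema, table_name
    from system.information_schema.tables
    where data_source_format = 'DELTA'
""")

delta_tables.show(truncate=False)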

In short, this is one of the many new and exciting features of Unity Catalog.

Dan Davis

Data Architect | Author

This is great to know, thanks for sharing. You just made my life much easier!

