We are always busy! There's a new Apache Arrow 19.0.0 release: https://lnkd.in/dkj5uKHn but wait! We also have new releases for Apache Arrow Go 18.1.0, Apache Arrow Rust 53.4.0 and Apache Arrow ADBC libraries 16. Check out those too!
Apache Arrow updates
Most relevant updates
-
I worked on part of the boolean optimizations ( https://lnkd.in/gBApgviu ) and hope I can do some optimizations on unpacking in 17.0.0.
We have a new Arrow release! See the highlights for Apache Arrow 16.0.0 here: https://lnkd.in/d9rfYaN2
-
With the Trino 466 release, users can now leverage Apache Ranger policies to manage access control for resources such as catalogs, schemas, tables, and columns. This includes advanced capabilities like dynamic column masking and row filtering. https://lnkd.in/gmeb5NsR

For over a decade, Apache Ranger has been enhancing its authorization and auditing capabilities, extending its reach across diverse data sources, no matter where they reside. Enterprises, vendors, and community contributors continue to expand Apache Ranger's coverage to address evolving governance needs. At its core, Apache Ranger's policy model offers a rich set of features, including:
- Classification-based policies
- Attribute-based policies
- Role-based policies
- Column masking and row filtering
- Delegated policy administration
- Custom conditions
- Ability to actively deny access
- Security zones
- Centralized access audit logs
- Governed data sharing (in development)

Apache Ranger also provides an intuitive UI for policy management, as well as Java, Python, and REST APIs for seamless integration. Go Rangers!
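As a rough sketch of what the REST integration can look like, the snippet below creates a minimal allow policy using the JDK's built-in HTTP client, assuming Ranger's public v2 endpoint (service/public/v2/api/policy). The Ranger URL, credentials, service name (dev_trino), resource names, and group are placeholders, and the exact policy JSON accepted depends on your Ranger version and the service definition in use.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class RangerPolicyExample {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and credentials; adjust to your Ranger Admin deployment.
        String rangerUrl = "http://ranger-admin:6080/service/public/v2/api/policy";
        String basicAuth = Base64.getEncoder().encodeToString("admin:password".getBytes());

        // A minimal allow policy: let the hypothetical 'analysts' group run SELECT
        // against sales.* in a Trino service registered in Ranger as 'dev_trino'.
        String policyJson = """
            {
              "service": "dev_trino",
              "name": "analysts-read-sales",
              "resources": {
                "catalog": { "values": ["hive"] },
                "schema":  { "values": ["sales"] },
                "table":   { "values": ["*"] },
                "column":  { "values": ["*"] }
              },
              "policyItems": [
                { "groups": ["analysts"],
                  "accesses": [ { "type": "select", "isAllowed": true } ] }
              ]
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(rangerUrl))
            .header("Content-Type", "application/json")
            .header("Authorization", "Basic " + basicAuth)
            .POST(HttpRequest.BodyPublishers.ofString(policyJson))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```

The same policy could be managed from the Ranger UI or the Python client; the REST route is handy when policy management needs to be wired into provisioning or CI scripts.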
-
Here's your weekly ASF release roundup!
- HttpComponents Core 5.3 is now available for download. To download: https://bit.ly/3XxoUK7 To read the release notes: https://bit.ly/3TwkPF2
- Apache Groovy 4.0.23 has been released. Apache Groovy is a multi-faceted programming language for the Java virtual machine. To download: https://bit.ly/3qNI7uM
- Apache Beam 2.59.0 has been released. This release includes bug fixes, features, and improvements detailed on the Beam blog: https://bit.ly/47CNk9F
- Apache Arrow ADBC 14 has been released. Apache Arrow is a columnar in-memory analytics layer designed to accelerate big data. The release is now available from here: https://bit.ly/3BkaOnS
- Apache Flink CDC 3.2.0 is now available. Apache Flink CDC is a distributed data integration tool for real-time data and batch data. To download: https://bit.ly/3BsFY9E
#opensource #data #bigdata #Java
-
Dynamic Watermarking With IMGProxy and Apache APISIX https://lnkd.in/duYw4NNS Last week, I described how to add a dynamic watermark to your images on the JVM. I didn't find any library, so I had to develop the feature myself, or, more precisely, an embryo of a feature. Depending on your tech stack, you may need to search for an existing library or roll up your sleeves; Rust, for example, offers such a library out of the box. Worse, this approach may be impossible if you don't have access to the source image. Another alternative is to use ready-made components, namely imgproxy and Apache APISIX. I've already combined them to resize images on the fly.
-
Apache Groovy tip: Flatten nested lists with ease. The spread operator untangles them for cleaner code https://lnkd.in/gvyX_Ms9 #OpenSource
-
Apache Hudi provides documentation we can follow to run the demo on local Docker. To follow the described steps, the prerequisites have to be installed first. I installed them using Conda with a preconfigured environment.yml file. Homebrew or another package manager could also install the prerequisites, but these days I use Conda to create an isolated working environment. Note that the post doesn't go through the Apache Hudi demo itself; it only shows how to use Conda to create an environment with the prerequisites. https://lnkd.in/gwftzjMf
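For reference, spinning up the isolated environment from such a file usually comes down to two commands; the environment name below is a placeholder for whatever the preconfigured environment.yml declares.

```sh
# Create the environment from the preconfigured environment.yml (the name is taken from the file)
conda env create -f environment.yml

# Activate it; 'hudi-demo' stands in for the name declared in environment.yml
conda activate hudi-demo
```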
-
"Avro" is a row-based format. It was created by Apache. Each record contains a header that describes the structure of the data in the record. This header is stored as JSON. The data is stored as binary information. An application uses the information in the header to parse the binary data and extract the fields it contains. Avro is a good format for compressing data and minimizing storage and network bandwidth requirements. https://avro.apache.org/
-
Microsoft Office licenses are very expensive, yet Office documents are the backbone of almost every organisation. I have been working with Office conversions since around 2008. Before discovering Apache POI, I used iText for PDFs and, I think, JExcel for producing Excel files from Java programs. Let me know if you want to cut down on Microsoft Office costs for your company, and I can incorporate these tools into your organisation with much cheaper licensing.
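For a taste of what that looks like in practice, here is a minimal sketch that produces an .xlsx file with Apache POI's XSSF API; the sheet name, cell contents, and file name are arbitrary.

```java
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class PoiExcelExample {
    public static void main(String[] args) throws Exception {
        // Build a one-sheet workbook entirely in memory.
        try (XSSFWorkbook workbook = new XSSFWorkbook()) {
            Sheet sheet = workbook.createSheet("Invoices");

            Row header = sheet.createRow(0);
            header.createCell(0).setCellValue("Customer");
            header.createCell(1).setCellValue("Amount");

            Row row = sheet.createRow(1);
            row.createCell(0).setCellValue("ACME Corp");
            row.createCell(1).setCellValue(1234.56);

            // Write the workbook out as a regular .xlsx file; no Office installation is needed.
            try (FileOutputStream out = new FileOutputStream("invoices.xlsx")) {
                workbook.write(out);
            }
        }
    }
}
```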
Here's your weekly ASF release roundup!
- Apache POI allows programmers to create, modify, and display MS Office files using Java programs. POI 5.3.0 is now available for download: https://bit.ly/3WxJV87 Release notes can be found here: https://bit.ly/3YaZ14V
- Apache Iceberg is an open table format for huge analytic datasets. Iceberg 1.6.0 is now available for download: bit.ly/46o5Jqn
- Apache PDFBox 2.0.32 is now available for download: bit.ly/41lR2jy
- Apache Storm is a distributed, fault-tolerant, and high-performance realtime computation system that provides strong guarantees on the processing of data. Storm 2.6.3 is available for download: bit.ly/3qtSbZM
- Apache Kyuubi is a distributed and multi-tenant gateway to provide serverless SQL on Data Warehouses and Lakehouses. Kyuubi 1.9.2 is now available for download: bit.ly/3SqHnWX
#opensource #ASF25years
-
I Just Contributed to The Apache Software Foundation, and It Feels AMAZING! Guess what? I just made my FIRST EVER open source contribution to Apache Beam, and I'm absolutely buzzing with excitement! This wasn't just a small tweak: I drove up the code coverage in the #sql package (written in Golang) from 35% to 70%! That's right, I added a bunch of unit tests to make the codebase more robust and reliable.

Here's what I did:
- Added comprehensive Go unit tests to address issue #21384.
- Wrote test cases like TestOutputType, TestTransform, and TestMultipleOptions to ensure proper validation and configuration.
- Followed the Golang codebase style and conventions while adding clear documentation and comments.

Why this matters: this contribution wasn't just about writing code, it was about making The Apache Software Foundation ecosystem stronger, more reliable, and ready for the future. I'm so proud to have played a small part in that.

A HUGE THANK YOU to Robert Burke for guiding me through this process - your support and mentorship made this possible! If you've ever thought about contributing to open source, DO IT. It's one of the most rewarding experiences ever. Start small, ask questions, and just go for it. You've got this!

You can check out the PR here: https://lnkd.in/d2Nf5S9m

#OpenSource #ApacheBeam #GoLang #FirstContribution #SoftwareDevelopment #CodingJourney #Community