
Conversation

@Deependra-Patel
Member

No description provided.

Change-Id: I7caaf0a6012df33ca39a421c241be9b6c2669303
@gemini-code-assist

Summary of Changes

Hello @Deependra-Patel, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers get up to speed quickly.

This pull request adds online inference support to the dataproc_ml library via Google Cloud Vertex AI Endpoints. It introduces a dedicated handler that runs predictions on Spark DataFrames against deployed Vertex AI models, with configurable parameters and error handling for prediction mismatches, streamlining the use of cloud-hosted models within Spark workflows.

Highlights

  • New VertexEndpointHandler: Introduced a new VertexEndpointHandler class to enable online inference on Spark DataFrames using Google Cloud's Vertex AI Endpoints.
  • Batched Prediction Logic: The core VertexEndpoint class handles sending batched prediction requests to Vertex AI endpoints, optimizing performance and ensuring proper handling of responses.
  • Configurable Handler: The VertexEndpointHandler provides a fluent API for configuring project, location, prediction parameters, batch size, and dedicated endpoint usage.
  • Comprehensive Testing: Both unit and integration tests have been added to validate the functionality, ensuring correct initialization, batched request handling, and end-to-end operation with a live Vertex AI endpoint.
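To illustrate the fluent configuration described in the highlights, here is a hedged sketch of what such a builder-style handler could look like. The method and attribute names (`with_project`, `with_location`, `with_batch_size`) are assumptions inferred from the summary, not the PR's verbatim interface.

```python
# Hypothetical sketch of a fluent (builder-pattern) handler configuration.
# Names are illustrative assumptions, not the PR's actual API.
class VertexEndpointHandler:
    """Configures online inference against a Vertex AI endpoint."""

    def __init__(self, endpoint_name):
        self.endpoint_name = endpoint_name
        self.project = None
        self.location = None
        self.batch_size = 10  # assumed documented default

    def with_project(self, project):
        self.project = project
        return self  # returning self is what makes the calls chainable

    def with_location(self, location):
        self.location = location
        return self

    def with_batch_size(self, batch_size):
        self.batch_size = batch_size
        return self

handler = (VertexEndpointHandler("1121351227238514688")
           .with_project("my-project")
           .with_location("us-central1")
           .with_batch_size(32))
print(handler.project, handler.location, handler.batch_size)
# my-project us-central1 32
```

Each setter returns `self`, so the configuration reads as a single chained expression, matching the builder pattern the review notes below.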

@gemini-code-assist bot left a comment

Code Review

This pull request introduces a VertexEndpointHandler to perform online inference against a Vertex AI Endpoint from Spark. The implementation includes the handler itself, a model wrapper class, and comprehensive unit and integration tests. The code is well-structured and follows a builder pattern for easy configuration. I've identified a potential runtime error in the VertexEndpoint class due to improper handling of the batch_size parameter and a point of fragility in the integration tests due to a hardcoded resource ID. My comments provide suggestions to improve robustness and test reliability.

```python
aiplatform.init(project=project, location=location)
self.endpoint_client = aiplatform.Endpoint(endpoint_name=endpoint)
self.predict_parameters = predict_parameters
self.batch_size = batch_size
```


Severity: high

The batch_size parameter can be None, but the call method doesn't handle this case, which will lead to a TypeError when it's used in range(). The docstring for __init__ also states that batch_size defaults to 10, but this default is not applied here. To prevent runtime errors and align with the documentation, you should assign a default value if None is provided.

Suggested change

```diff
- self.batch_size = batch_size
+ self.batch_size = batch_size if batch_size is not None else 10
```
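The effect of the suggested fix can be sketched in isolation: default the batch size when the caller passes `None`, then iterate the instances in fixed-size chunks. The function and constant names here are illustrative, not the PR's exact code.

```python
# Minimal sketch of the suggested None-handling plus batched iteration.
DEFAULT_BATCH_SIZE = 10  # the default the __init__ docstring documents

def batched_predict(instances, batch_size=None):
    # Without this guard, range() below would raise
    # "TypeError: 'NoneType' object cannot be interpreted as an integer".
    batch_size = batch_size if batch_size is not None else DEFAULT_BATCH_SIZE
    batches = []
    for start in range(0, len(instances), batch_size):
        batches.append(instances[start:start + batch_size])
    return batches

print([len(b) for b in batched_predict(list(range(25)))])     # [10, 10, 5]
print([len(b) for b in batched_predict(list(range(25)), 7)])  # [7, 7, 7, 4]
```

With the default applied, callers that omit `batch_size` get the documented chunking of 10 instances per request instead of a runtime error.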

```python
location = os.getenv("GOOGLE_CLOUD_LOCATION", "us-central1")
# TODO: Replace with endpoint creation during test run which shouldn't
# take more than 20 mins
endpoint_name = "1121351227238514688"
```


Severity: medium

The integration test relies on a hardcoded endpoint name. This makes the test fragile and dependent on an external resource that might change or be deleted, causing test failures. While the TODO comment acknowledges this, it's important to prioritize making this test self-contained. Consider using a test fixture that programmatically creates a temporary endpoint for the test run and tears it down afterwards to ensure the test is reliable and independent of pre-existing infrastructure.
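The create-and-tear-down idea can be sketched as a context manager (the same shape a pytest fixture would take). Everything here is an assumption for illustration: `FakeAiplatform` stands in for `google.cloud.aiplatform` so the sketch runs without GCP credentials, and `temp_endpoint` is a hypothetical helper, not part of the PR.

```python
# Hypothetical sketch: provision a temporary endpoint for the test run and
# always delete it afterwards, so the test owns its own infrastructure.
import contextlib

class FakeAiplatform:
    """Runnable stand-in for google.cloud.aiplatform used in this sketch."""
    class Endpoint:
        created = []

        def __init__(self, display_name):
            self.display_name = display_name
            self.deleted = False
            FakeAiplatform.Endpoint.created.append(self)

        @classmethod
        def create(cls, display_name):
            return cls(display_name)

        def delete(self, force=True):
            self.deleted = True

@contextlib.contextmanager
def temp_endpoint(aiplatform, display_name="dataproc-ml-it"):
    """Create an endpoint for the test; tear it down even if the test fails."""
    endpoint = aiplatform.Endpoint.create(display_name=display_name)
    try:
        yield endpoint
    finally:
        endpoint.delete(force=True)

with temp_endpoint(FakeAiplatform) as ep:
    in_test_deleted = ep.deleted  # endpoint is live inside the block
print(in_test_deleted, ep.deleted)  # False True
```

Wrapped as a pytest fixture, this removes the dependency on the hardcoded endpoint ID; the trade-off is the endpoint creation time the TODO mentions (up to ~20 minutes) added to each integration run.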

