
Add logs to debug 500 error in Synapse #13

Open
dylanw-oss wants to merge 8 commits into master from DML_500_logs

Conversation

@dylanw-oss
Owner

Related Issues/PRs

#xxx

What changes are proposed in this pull request?

Briefly describe the changes included in this Pull Request.

How is this patch tested?

  • I have written tests (not required for typo or doc fix) and confirmed the proposed feature/bug-fix/change works.

Does this PR change any dependencies?

  • No. You can skip this section.
  • Yes. Make sure the dependencies are resolved correctly, and list changes here.

Does this PR add a new feature? If so, have you added samples on website?

  • No. You can skip this section.
  • Yes. Make sure you have added samples following the steps below.
  1. Find the corresponding markdown file for your new feature in the website/docs/documentation folder.
    Make sure you choose the correct class (estimators/transformers) and namespace.
  2. Follow the pattern in the markdown file and add another section for your new API, including pyspark, scala (and potentially .NET) samples.
  3. Make sure the DocTable points to the correct API link.
  4. Navigate to the website folder and run yarn run start to make sure the website renders correctly.
  5. Don't forget to add <!--pytest-codeblocks:cont--> before each Python code block to enable auto-tests for the Python samples.
  6. Make sure the WebsiteSamplesTests job passes in the pipeline.

@github-actions

Hey @dylanw-oss 👋!
Thank you so much for contributing to our repository 🙌.
Someone from SynapseML Team will be reviewing this pull request soon.

We use semantic commit messages to streamline the release process.
Before your pull request can be merged, you should make sure your first commit and PR title start with a semantic prefix.
This helps us to create release messages and credit you for your hard work!

Examples of commit messages with semantic prefixes:

  • fix: Fix LightGBM crashes with empty partitions
  • feat: Make HTTP on Spark back-offs configurable
  • docs: Update Spark Serving usage
  • build: Add codecov support
  • perf: improve LightGBM memory usage
  • refactor: make python code generation rely on classes
  • style: Remove nulls from CNTKModel
  • test: Add test coverage for CNTKModel

To test your commit locally, please follow our guide on building from source.
Check out the developer guide for additional guidance on testing your change.

//
// val processedData = featurizedModel.transform(convertedLabelDataset).select(colstoSelect.map(col): _*)

val processedData = featurizedModel.transform(convertedLabelDataset).select(getFeaturesCol, getLabelCol)
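A possible way to keep the narrowed select while still honoring a user-set weight column is sketched below. This is an assumption about the fix, not code from this PR; it presumes the estimator follows the usual Spark ML shared-param pattern (`getFeaturesCol`, `getLabelCol`, `getWeightCol` accessors) and that `col` comes from `org.apache.spark.sql.functions`:

```scala
import org.apache.spark.sql.functions.col

// Select only the columns the downstream fit needs; include the weight
// column only when the user has explicitly set one. Params.get returns
// an Option and never throws, unlike $(weightCol).
val baseCols = Seq(getFeaturesCol, getLabelCol)
val colsToSelect =
  if (get(weightCol).exists(_.nonEmpty)) baseCols :+ getWeightCol
  else baseCols

val processedData = featurizedModel
  .transform(convertedLabelDataset)
  .select(colsToSelect.map(col): _*)
```

This keeps the cache small for the common case while avoiding the missing-weightCol exception described in the next comment.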
Owner Author


This line unblocks the cache and fit that follow.

There is a performance issue here: it caches the whole dataset, but we only need the "features" and "label" columns.

Owner Author


But the change introduces another problem: if the user sets weightCol, it throws an exception that weightCol is not available.
I tried to handle it above (the commented-out code), but it does not work:

if (isDefined(weightCol) || !$(weightCol).isEmpty)

This throws java.util.NoSuchElementException: Failed to find a default value for weightCol.
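The exception follows from the `||`: when `isDefined(weightCol)` is false, the right-hand side `$(weightCol)` is still evaluated, and `$(...)` throws `NoSuchElementException` for a param with neither a set value nor a default. Spark ML's `Params.get` returns an `Option` and never throws, so a guard along these lines (a sketch, not code from this PR) avoids the crash:

```scala
// get(weightCol): Option[String] -- None when the user never set it
// and no default exists, so this line can never throw.
val useWeight = get(weightCol).exists(_.nonEmpty)

if (useWeight) {
  // e.g. add $(weightCol) to the selected columns here; $(...) is safe
  // inside this branch because the param is known to have a value.
}
```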

println(s"$this - [trainInternal] residualsDF1 columns ${residualsDF1.columns.mkString(",")}, size ${residualsDF1.count()}")
println(s"$this - [trainInternal] residualsDF2 columns ${residualsDF2.columns.mkString(",")}, size ${residualsDF2.count()}")

val coefficients = Array(residualsDF1, residualsDF2).map(regressor.fit).map(_.coefficients(0))
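If the run stalls here, one plausible cause is that each `count()` in the debug logs above and each `regressor.fit` below launches a separate Spark job that recomputes the residual lineage from scratch. A sketch of one mitigation (an assumption about the fix, not code from this PR): cache the residual DataFrames before the debug counts so the counts and the fits reuse the materialized data.

```scala
// Cache once so the debug counts and the subsequent fits reuse the
// materialized residuals instead of recomputing the full lineage twice.
val residuals = Array(residualsDF1, residualsDF2).map(_.cache())
residuals.foreach { df =>
  println(s"$this - [trainInternal] columns ${df.columns.mkString(",")}, size ${df.count()}")
}
val coefficients = residuals.map(regressor.fit).map(_.coefficients(0))
```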
Owner Author


Based on the logs, the DML notebook run is blocked here.
Any ideas?


@dylanw-oss dylanw-oss changed the title Add logs Add logs to debug 500 error in Synapse Mar 29, 2023