User Details
- User Since
- Feb 28 2022, 9:41 AM
Today
- Merge with latest changes (D128000)
- Update based on feedback
Thu, Jun 16
Thu, Jun 2
With the removal of the sparse_tensor.init operation, this change no longer makes sense.
May 24 2022
I plan to add custom reduction using sparse_tensor.binary later. This PR will focus solely on explicit starting values for non-custom reductions.
May 3 2022
@aartbik Thanks for reviewing and helping me converge on a good solution for lowering.
Once the build passes, I will merge it.
Make helper functions static
Apr 28 2022
Few more comments
Okay, adding more comments.
Updates based on feedback
Apr 27 2022
Sorry, my comments are out of order with the new PR; they should be read as if they came before it. I forgot to submit them until now.
Apr 26 2022
Change back to one Operation *
Apr 22 2022
Change how repeat calls to unary and binary are handled
Apr 21 2022
Apr 19 2022
@aartbik I think this PR is ready for a full review. It's passing all the tests.
Add test involving linalg.index
Merge latest from main
Apr 18 2022
- Make the absent region of sparse_tensor.unary work
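For context, a rough sketch of what a sparse_tensor.unary with a populated absent region looks like. The operand names and the body of each region are illustrative, not taken from this diff:

```mlir
// Negate stored (present) entries, and emit -1.0 for entries that
// are not stored (absent) in the sparse operand.
%result = sparse_tensor.unary %a : f64 to f64
  present = {
    ^bb0(%x: f64):
      %ret = arith.negf %x : f64
      sparse_tensor.yield %ret : f64
  }
  absent = {
    %cfm1 = arith.constant -1.0 : f64
    sparse_tensor.yield %cfm1 : f64
  }
```

Leaving the absent region empty (`absent = {}`) instead means missing entries contribute nothing to the result, which is the default sparse behavior.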
Apr 16 2022
Updates based on feedback
Apr 13 2022
Apr 4 2022
@aartbik Please take a look and let me know your thoughts on my general approach. I also indicated where I am stuck and could use some help, as I don't truly understand the lattice-set theory.
Mar 17 2022
Mar 16 2022
@aartbik Unless you find some more updates to the descriptions, I think this is ready. What is the next step? I don't think I have commit rights, so you will need to commit on my behalf.
Minor text updates
Mar 15 2022
Remove SameTypeOperands trait for binary
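Dropping the SameTypeOperands trait allows the two operands of sparse_tensor.binary to carry different element types. A hypothetical sketch, with types and region bodies chosen purely for illustration:

```mlir
// Mixed f64/i32 operands, permitted once SameTypeOperands is removed.
// Empty left/right regions mean non-overlapping entries are dropped.
%r = sparse_tensor.binary %a, %b : f64, i32 to f64
  overlap = {
    ^bb0(%x: f64, %y: i32):
      %yf = arith.sitofp %y : i32 to f64
      %m = arith.mulf %x, %yf : f64
      sparse_tensor.yield %m : f64
  }
  left = {}
  right = {}
```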
Mar 14 2022
Mar 10 2022
Change binary and unary signature
@aartbik Let me know your thoughts about my response to adding kind to unary. Once we have agreement, I will update the diff.
Mar 9 2022
Incorporate comments into design of binary and unary
Mar 8 2022
Mar 7 2022
Mar 6 2022
Mar 4 2022
Try again with all the changes this time
Okay, so my understanding was tainted by how a GitHub PR works. I'll try to update the differential to include all 3 commits, rather than just the latest commit (which is what arc diff defaults to).
They are still there in the history. I did 3 git commits locally: the big one, a minor fix, and then a clang-format one.
While I appreciate that you can view all 3 separately, I don't see a way to view all the changes at once, which feels like the most important view.
Look at Diff 2 for the real changes.
Fix Formatting
Introduce new binary and unary op for sparse_tensor dialect
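For readers landing here, a rough sketch of the shape of the new binary op (the region names follow the sparse_tensor dialect conventions; the exact syntax introduced by this diff may differ):

```mlir
// Combine two sparse operands: overlap runs where both have a stored
// entry, while left/right control what happens where only one does.
// "identity" passes that side's stored value through unchanged.
%sum = sparse_tensor.binary %a, %b : f64, f64 to f64
  overlap = {
    ^bb0(%x: f64, %y: f64):
      %s = arith.addf %x, %y : f64
      sparse_tensor.yield %s : f64
  }
  left = identity
  right = identity
```

The unary op follows the same pattern with present and absent regions in place of overlap/left/right.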