forked from apache/tvm
[TIR] Pass to hoist allocations from tvm main to function signature #4
Open
mikepapadim wants to merge 5 commits into mbaret:relax-aot from mikepapadim:relax_workspace_meta_uplift
Conversation
mbaret requested changes on Dec 16, 2022
@T.prim_func
def __tvm_main__(a: T.handle, b: T.handle, sid_0_1: T.Ptr[T.uint8], sid_1_1: T.Ptr[T.uint8], output: T.handle):
    # function attr dict
    T.func_attr({"global_symbol": "test_mod___tvm_main__", "runner_function": True, "target": T.target({"kind": "llvm", "tag": "", "keys": ["cpu"]}), "input_vars": [a, b], "output_vars": [output]})
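The signature above shows the effect of the pass: the workspace pointers `sid_0_1` and `sid_1_1` are received as parameters instead of being allocated inside the body. A minimal sketch of that hoisting idea, using plain Python dicts as a stand-in for TIR functions (none of these names are real TVM APIs):

```python
# Hypothetical model of the hoisting transform. Functions are plain dicts
# with "params" (parameter names) and "allocs" (body allocations as
# (name, size_bytes) pairs); real TIR uses PrimFunc and Allocate nodes.

def hoist_allocations(func):
    """Move body-level workspace allocations into the function signature."""
    hoisted = dict(func)
    # Each former allocation becomes a caller-provided workspace pointer.
    hoisted["params"] = func["params"] + [name for name, _ in func["allocs"]]
    hoisted["allocs"] = []
    return hoisted

main = {"params": ["a", "b", "output"],
        "allocs": [("sid_0_1", 140), ("sid_1_1", 140)]}
hoisted = hoist_allocations(main)
print(hoisted["params"])  # ['a', 'b', 'output', 'sid_0_1', 'sid_1_1']
```

This mirrors the diff above: the caller (e.g. an AOT runner) now owns the workspace memory and threads the pointers through the signature.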
can we add another function attr to indicate workspace vars?
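One possible shape for such an attribute, shown as a plain Python dict mirroring the `T.func_attr` call in the diff (the `workspace_vars` key is what the comment proposes, not an existing TVM attribute):

```python
# Hypothetical extension of the function attribute dict; "workspace_vars"
# is the attribute the review comment asks for, not an existing TVM key.
func_attrs = {
    "global_symbol": "test_mod___tvm_main__",
    "runner_function": True,
    "input_vars": ["a", "b"],
    "output_vars": ["output"],
    "workspace_vars": ["sid_0_1", "sid_1_1"],  # proposed addition
}

# A runtime wrapper could then tell user I/O apart from scratch buffers:
io_vars = set(func_attrs["input_vars"]) | set(func_attrs["output_vars"])
scratch = [v for v in func_attrs["workspace_vars"] if v not in io_vars]
print(scratch)  # ['sid_0_1', 'sid_1_1']
```

Without such an attribute, a consumer of the hoisted signature has no declarative way to know which handles it must back with workspace memory.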
    b_buffer = T.match_buffer(b, [T.int64(5), T.int64(7)], dtype="float32", align=16)
    output_buffer = T.match_buffer(output, [T.int64(5), T.int64(7)], dtype="float32", align=16)
    # body
    sid_0 = T.match_buffer(sid_0_1, [140], dtype="uint8", strides=[1], elem_offset=0, align=16)
we should keep the storage_scope set on these
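The concern is that rebuilding a buffer from scratch during the rewrite silently resets any non-default `storage_scope`. A sketch of the fix, with a plain Python dataclass standing in for a TIR buffer (field names here are illustrative):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Buffer:
    # Minimal stand-in for a TIR buffer; real buffers carry more fields.
    name: str
    size: int
    dtype: str
    storage_scope: str = "global"

def rewrite_as_workspace(buf, new_name):
    # Rebuild the buffer under a new name but carry the storage scope
    # across explicitly, rather than falling back to the default.
    return replace(buf, name=new_name)

sid_0 = Buffer("sid_0_1", 140, "uint8", storage_scope="global.workspace")
rewritten = rewrite_as_workspace(sid_0, "sid_0")
print(rewritten.storage_scope)  # 'global.workspace'
```

Using `dataclasses.replace` (or, in real TIR, copying the scope field when constructing the new buffer) ensures downstream passes that key on the scope still see it after hoisting.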
mbaret pushed a commit that referenced this pull request on Feb 13, 2023
* Add initial IRBuilder.
* Add function output to irbuilder; update based on new AST.
* Add call method; clean up bindings
* Add test.
* Add multifuction test
* Move implementation to C++; infer shape and type
* update op python hook
* More tests and bug fix
* Add comments.
* Update shape/type inference.
* Restructure code; add python type hint.
* Cleanup code.
* Rebase; address comments.
* Add call intrinsic.
* nits.
* Remove call op.
* Migrate scope to C++ using tvm::With.
* Address naming.
* Add GetBlocks API.
* Unify EmitOutput APIs; add more comments.
* Remove shape and type deduction code.
* Also remove the shape/type attr interface.
* Address comments.
* Differentiate global and local function.
* Reset counter after building func/block.
* Rebase.
* Remove shape infer builtin.
* Return from void function as empty tuple.

Co-authored-by: Michalis Papadimitriou <mikepapadim@hotmail.com>
mbaret pushed a commit that referenced this pull request on Feb 13, 2023
* [IR] Introduce StructInfo
* StructInfoFunctor and Analysis Support
* [TVMScript] Parse type/shape annotation with StructInfo
* remove runtime type assign
* Remove type/shape during parsing (#2)
* Normalizer prep: simple checks and legacy function renaming.
* Struct info deduction in BlockBuilder.
* Two TODOs
* StructInfo Normalizer Fixes (#3)
* StructInfo AST Fix
* Fix Extern Func Deduction and shape mutator.
* Update VoidStructInfo & globalvar (#4)
* Fix passes and proper sinfo propagation.
* Refactor EraseToWellDefined to Enable Remapping
* [WIP] First stab at symbolic param tracking
* Update EraseToWellDefined to support symbolic shape return (#5)
* fix R.shape with ndim (apache#6)
* Remove update shape/type
* Address review comment, AnnotateTypeShape=>AnnotateStructInfo
* Update include/tvm/script/ir_builder/relax/frame.h (Co-authored-by: Ruihang Lai <ruihangl@cs.cmu.edu>)
* Address comments
* Update printer to use structinfo (apache#7)
* Update Error mechanism to prep for obj loc based reporting
* Symbolic shape aware function call return value derivation. The main flow works as follows:
  - Match and populate shape_var_map and var_map by visit each pair of param and call arguments.
  - Call EraseToWellDefined to map the ret parameter to new result.
* [ANALYSIS] Refactor well-form to only look at struct info.
* Update comments according to reviews.
* Update include/tvm/relax/struct_info.h

Co-authored-by: Ruihang Lai <ruihangl@cs.cmu.edu>
Co-authored-by: Siyuan Feng <Hzfengsy@sjtu.edu.cn>
Co-authored-by: Tianqi Chen <tqchen>
Co-authored-by: Ruihang Lai <ruihangl@cs.cmu.edu>
No description provided.