[tmva][sofie] Add HardSigmoid operator for ONNX inference #20933
Conversation
sanjibansg left a comment:
Hi @AdityaDRathore,
Thanks for your contribution and for developing the HardSigmoid operator implementation. The implementation looks good to me. However, we are in the process of consolidating unary operations (like activation functions) into the `ROperator_BasicUnary.hxx` file. Can you make the corresponding changes?
Thanks.
Test Results: 22 files, 22 suites, 3d 14h 55m 11s ⏱️. For more details on these failures, see this check. Results for commit 7963b08.
Hi @sanjibansg, thanks for the review! I started refactoring to move this into `ROperator_BasicUnary.hxx`. Since HardSigmoid is stateful (it must store the `alpha` and `beta` attributes parsed from the ONNX node), it doesn't fit naturally into the current BasicUnary design without forcing all other simple operators (like Abs, Cos) to carry unused attribute overhead.

I also checked how other stateful unary operators are handled and saw that they are kept as separate operator classes. Given this precedent, do you think it is safer to keep HardSigmoid separate to preserve the clean, stateless nature of BasicUnary? Or do you have a specific pattern in mind for handling attributes within the traits system?
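For illustration, a minimal sketch of the two designs being contrasted (the trait and class names here are simplified assumptions, not the actual SOFIE headers):

```cpp
#include <string>

// A stateless unary op in the trait style: one static method that emits the
// expression, no per-node data. (Simplified assumption of the BasicUnary
// pattern, not the real ROperator_BasicUnary.hxx.)
struct AbsTrait {
   static std::string Expr(const std::string &x) { return "std::abs(" + x + ")"; }
};

// HardSigmoid cannot be written that way: alpha and beta come from the ONNX
// node, so the operator has to carry them as members.
struct HardSigmoidOp {
   float fAlpha = 0.2f; // parsed from the ONNX "alpha" attribute
   float fBeta = 0.5f;  // parsed from the ONNX "beta" attribute
   std::string Expr(const std::string &x) const
   {
      return "std::fmax(0.f, std::fmin(1.f, " + std::to_string(fAlpha) + "f * " + x + " + " +
             std::to_string(fBeta) + "f))";
   }
};
```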
Ah yes! That is correct. Since this is a stateful operator, it may be better to keep it separate. Can you fix the formatting issues flagged in the CI? In the meantime, I will wait for the review of @lmoneta before we can merge this. Thanks!
Implemented the HardSigmoid operator (ONNX opset 6+) for Project SOFIE. This operator is a prerequisite for supporting MobileNetV3 architectures (specifically for the HardSwish activation). Implemented in `ROperator_HardSigmoid`.

- Uses hexfloat constants (`0x0p+0f`, `0x1p+0f`) for the clamp bounds to ensure bit-exact reproducibility and prevent implicit double-precision promotion.
- Attributes `alpha` and `beta` are strictly handled as floats to avoid `CVTSS2SD` overhead in the generated inference loop.
- Added GTest suite `TestSofieHardSigmoid` validating numerical correctness against SciPy reference data and verifying AVX-friendly code generation.
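As a rough illustration of the bit-exactness point above, the per-element computation the generated code aims for looks like this (a hand-written sketch of the intended pattern, not the literal SOFIE output):

```cpp
#include <cmath>

// All literals stay in single precision: 0x0p+0f == 0.0f and 0x1p+0f == 1.0f.
// Keeping the whole expression in float lets the compiler emit MAXSS/MINSS
// directly, with no CVTSS2SD round trip through double.
inline float HardSigmoid(float x, float alpha = 0.2f, float beta = 0.5f)
{
   return std::fmax(0x0p+0f, std::fmin(0x1p+0f, alpha * x + beta));
}
```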
Force-pushed from 7963b08 to 0c6123d.
Hi @sanjibansg
This Pull Request

This PR implements the `HardSigmoid` operator for the SOFIE (System for Optimized Fast Inference code Emit) engine. This is a standalone operator implementation compliant with the ONNX specification (Opset 6+). The implementation is critical for supporting MobileNetV3 architectures (notably for the `HardSwish` activation).

Changes or fixes:
- New Operator: Implemented `TMVA::Experimental::SOFIE::ROperator_HardSigmoid`.
- Uses `std::fmax` and `std::fmin` with specific hexfloat constants (`0x0p+0f`, `0x1p+0f`) to ensure the compiler emits pure AVX `MAXSS`/`MINSS` instructions without intermediate double-precision conversions.
- Enforces the `f` suffix on the `alpha` and `beta` coefficients to prevent implicit promotion to `double` during inference.
- Updated `RModelParser_ONNX` to correctly parse the `alpha` and `beta` attributes (defaults are set to `0.2`, `0.5`); see the parsing sketch after this list.
- Validation: Added a new unit test suite, `TestSofieHardSigmoid`.
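For reference, a hedged sketch of the attribute handling described in the parser bullet above (the helper name and include path are assumptions; `attribute_size()`, `attribute()`, `name()`, and `f()` are the standard protobuf accessors on `onnx::NodeProto`):

```cpp
#include "onnx/onnx_pb.h" // include path is an assumption; it varies by build setup

// Hypothetical helper: read the HardSigmoid attributes, falling back to the
// ONNX-mandated defaults when an attribute is absent.
void ParseHardSigmoidAttributes(const onnx::NodeProto &nodeproto, float &alpha, float &beta)
{
   alpha = 0.2f; // ONNX default
   beta = 0.5f;  // ONNX default
   for (int i = 0; i < nodeproto.attribute_size(); ++i) {
      const auto &attr = nodeproto.attribute(i);
      if (attr.name() == "alpha")
         alpha = attr.f();
      else if (attr.name() == "beta")
         beta = attr.f();
   }
}
```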
Checklist:

- Tests added (`TestSofieHardSigmoid` passing)

This PR fixes # (No specific issue linked, strictly feature addition)
cc: @guitargeek @moneta @devajithvs @bellenot