
Add entangling_layer to models.encodings #1260

Merged: 7 commits merged into master from encoding_entangling on Mar 14, 2024

Conversation

@renatomello (Contributor) commented Mar 13, 2024

Checklist:

  • Reviewers confirm new code works as expected.
  • Tests are passing.
  • Coverage does not decrease.
  • Documentation is updated.

@renatomello renatomello added the enhancement New feature or request label Mar 13, 2024
@renatomello renatomello added this to the Qibo 0.2.6 milestone Mar 13, 2024
@renatomello renatomello self-assigned this Mar 13, 2024
codecov bot commented Mar 13, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.95%. Comparing base (a603ddd) to head (0b9355f).

@@           Coverage Diff           @@
##           master    #1260   +/-   ##
=======================================
  Coverage   99.95%   99.95%           
=======================================
  Files          72       72           
  Lines       10454    10490   +36     
=======================================
+ Hits        10449    10485   +36     
  Misses          5        5           
Flag       | Coverage Δ
unittests  | 99.95% <100.00%> (+<0.01%) ⬆️


@MatteoRobbiati (Contributor) left a comment

Super fine with this kind of feature!

I think it would be nice to start thinking about how to move these features to the qiboml package (or, better, about whether we want them there at all).

Personally, I think this is exactly the kind of feature (I mean the encodings in general) that would fit perfectly there.

I say this because it could be crucial for deciding how to organize the code structure, and because I am currently working on the variational model and, consequently, on the encoding and decoding strategy.

@renatomello (Contributor, Author) commented Mar 13, 2024

> Super fine with this kind of feature! I think it would be nice to start thinking about how to move these features to the qiboml package. […]

Data encoders are very important for QML, but not exclusive to it. For instance, they matter for "regular" QFT and/or Quantum Phase Estimation applications (e.g., the unary-encoding application to option pricing included in Qibo's documentation, or the paper(s) on which the current implementation of the unary encoder is based). After all, a data encoder is nothing but a state preparation, so the applications are broader than QML.
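
For concreteness, a minimal sketch of the "encoder as state preparation" view, combining the unary encoder with the layer added in this PR. The exact signatures, defaults, and module paths below are assumptions based on this conversation and should be checked against the released qibo documentation:

```python
# Sketch only: the signatures of `unary_encoder` and `entangling_layer`
# are assumptions based on this PR and may differ from the released API.
import numpy as np

from qibo.models.encodings import entangling_layer, unary_encoder

nqubits = 4

# State preparation: the encoder is just a circuit that loads `data`
# into the amplitudes of the unary basis states.
data = np.random.rand(nqubits)
circuit = unary_encoder(data)

# The new entangling layer composes like any other qibo Circuit.
circuit += entangling_layer(nqubits)

print(circuit.draw())
```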

@MatteoRobbiati (Contributor) replied:

> Data encoders are very important for QML, but not exclusive to it. […]

Thanks for the reply! I see, this makes perfect sense. So the best path will be to integrate the encodings into qiboml, picking them up from Qibo. My only concern is avoiding redundancy between the two packages (but at this point that has to be handled in qiboml).

@MatteoRobbiati MatteoRobbiati self-requested a review March 13, 2024 15:47
@MatteoRobbiati (Contributor) left a comment

A final review, with one small suggestion below.

```python
# Excerpt from the reviewed diff (the enclosing `isinstance` check is
# reconstructed from the error message; `raise_error` is qibo's standard
# error helper):
if not isinstance(nqubits, int):
    raise_error(
        TypeError, f"nqubits must be type int, but it is type {type(nqubits)}."
    )

if nqubits <= 0.0:
```
@MatteoRobbiati (Contributor): Since you use this to define a Circuit, I think a similar error will be raised directly from line 332.
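
The point of this review comment is that Qibo's Circuit constructor performs its own validation of nqubits, so an invalid value would already fail at circuit construction. A rough illustration follows; the exact exception type and message raised by qibo are assumptions here:

```python
from qibo.models import Circuit

# Rough illustration of the review comment above: constructing a Circuit
# with a non-positive qubit count fails inside qibo itself, which makes an
# explicit `nqubits <= 0` guard partly redundant.  The exact exception
# type and message are assumptions.
try:
    Circuit(0)
except Exception as error:  # broad catch, since the exact type may vary
    print(f"Circuit(0) raised: {type(error).__name__}: {error}")
```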

@scarrazza scarrazza modified the milestones: Qibo 0.2.6, Qibo 0.2.7 Mar 13, 2024
@renatomello renatomello added this pull request to the merge queue Mar 14, 2024
Merged via the queue into master with commit f0548cd Mar 14, 2024
23 checks passed
@renatomello renatomello deleted the encoding_entangling branch March 14, 2024 04:47