
Layers are not unitary #1

@orielkiss


Hi!
I am trying to use this package to build a unitary neural network. I have a set of normalized input states and would like to pass them through a unitary layer and obtain normalized states at the end. I am using

```python
def forward(x):
    module = orthax.haiku.OrthogonalLinear(output_size=n, with_bias=False, t_init=t_init)
    return module(x)
```
which should simply apply a unitary matrix. Without optimization everything is fine, but once I start training, the output states become unnormalized. I know I could renormalize everything, but I am trying to use this to simulate a quantum circuit.
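
For reference, here is a minimal, self-contained sketch of how I call the layer and check the output norm at initialization. The dimension `n`, the Haiku transform boilerplate, and the concrete `t_init` below are placeholders just to make the sketch runnable, not my exact code:

```python
import jax
import jax.numpy as jnp
import haiku as hk
import orthax

n = 8                                      # placeholder state dimension
t_init = hk.initializers.Constant(0.0)     # stand-in for my t_init (reproduces the all-zero thetas below)

def forward(x):
    module = orthax.haiku.OrthogonalLinear(output_size=n, with_bias=False, t_init=t_init)
    return module(x)

model = hk.without_apply_rng(hk.transform(forward))

# A random normalized complex input state.
x = jax.random.normal(jax.random.PRNGKey(0), (n,)) + 1j * jax.random.normal(jax.random.PRNGKey(1), (n,))
x = x / jnp.linalg.norm(x)

params = model.init(jax.random.PRNGKey(2), x)
y = model.apply(params, x)
print(jnp.linalg.norm(y))  # ~1.0 before any optimization step
```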

My initial parameters are:

```python
{'orthogonal_linear': {'thetas': DeviceArray([0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j,
             0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j,
             0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j,
             0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j,
             0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j, 0.+0.j,
             0.+0.j], dtype=complex64)}}
```

and I am optimizing a fidelity loss between the output states and the target states. Does anyone have an idea what is going on?
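
For concreteness, the training step looks roughly like the sketch below (continuing from the snippet above). The choice of optax, the learning rate, and the exact fidelity expression are illustrative assumptions, not my exact code:

```python
import optax

# Fidelity-style loss (assumed form): 1 - mean |<target|output>|^2 over the batch.
def loss_fn(params, inputs, targets):
    outputs = model.apply(params, inputs)
    overlaps = jnp.sum(jnp.conj(targets) * outputs, axis=-1)
    return 1.0 - jnp.mean(jnp.abs(overlaps) ** 2)

optimizer = optax.adam(1e-2)  # placeholder optimizer / learning rate
opt_state = optimizer.init(params)

@jax.jit
def train_step(params, opt_state, inputs, targets):
    loss, grads = jax.value_and_grad(loss_fn)(params, inputs, targets)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss

# After a few steps like this, the output norm drifts away from 1:
# params, opt_state, loss = train_step(params, opt_state, x[None, :], x[None, :])
# print(jnp.linalg.norm(model.apply(params, x)))
```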

Thank you again for this awesome package.
Best
Oriel
