912 Commits

Author SHA1 Message Date
Jan Kundrát
59f9f35817 CI: provision secrets for uploads to PyPI
Originally I was hoping that this would be possible via the parent job
collection (oopt-zuul-jobs), but unfortunately that's not the case, and
I have to do this in this project.

I created the token at PyPI's own web UI for my account, and saved the
secret to a file. Then I executed Zuul's helper script:

  $ ~/work/prog/openstack/zuul/zuul/tools/encrypt_secret.py \
    --infile secret-pypi --tenant TelecomInfraProject \
    https://zuul.vexxhost.dev/ Telecominfraproject/oopt-gnpy

Change-Id: I05ab853ae26a449212e832e0f7e261c9ed71e364
2020-06-17 23:01:33 +02:00
Jan Kundrát
1bb475671d docs: badges: show code coverage as well
One yellowish item among the badges, now that should make it look legit
and real 🌈.

Change-Id: I13adadb1edc538428d24cc9d50c0569f49c5d7a3
2020-06-17 21:52:09 +02:00
Jan Kundrát
566dedbdbb docs: badges: show code quality via LGTM.com
It gives us an A, so we can link there :).

Change-Id: I0613394e135bf3de3dfd9e984f970328a5aa2fa4
2020-06-17 21:43:08 +02:00
Jan Kundrát
b44c4cec16 docs: boast about the contributor count
...because why not, right?

Change-Id: Ib853d7b3496f2b7d915785e72bfa314bf82a2802
2020-06-17 21:31:06 +02:00
Jan Kundrát
93d11ba408 docs: badges: prepare for a future link to Zuul
Change-Id: Iff5ffa1f6d1623c8d5b612f7d899c4e8f3f7cb85
2020-06-17 21:20:57 +02:00
Jan Kundrát
637670fcfa docs: Move the gnpy-path-requests into README
This looks like something that's been "always" part of the Excel guide.
I think it is better placed in the README. After the release we might
want to improve the docs even more, but hey, that's something which
should be done after some discussion, not via these commits that I'm
going to self-approve shortly.

Change-Id: Icd73f4bb3a43f1a684ec7a364e4ebcc9b8e6af88
2020-06-17 21:19:13 +02:00
Jan Kundrát
7836297708 README: point all badges to the master branch
We want to deprecate `develop`, so let's start pointing to master now.

Change-Id: Ibf69c36f44fde7081238508d55137c8df21cfa60
2020-06-17 21:02:21 +02:00
Jan Kundrát
1f8b4ab9a2 docs: Move install instructions into the generated docs
The PowerShellSessionLexer within pygments has trouble parsing the
example as something valid, so let's revert to a generic one.

Change-Id: I188e1d7bf2f7229ad15c1f0443584344fd11bf84
2020-06-17 20:43:37 +02:00
Jan Kundrát
05eb312f4a docs: Do not run rstcheck on sphinx-consumed docs
Turns out that we're already running Sphinx with options that are strict
enough to perform all sorts of checks. The `rstcheck` standalone tool is
too strict: it warns about legitimate Sphinx constructs (hence the
ignore list) and also happens to issue false warnings. Let's rely on
Sphinx for stuff below docs/, but still invoke rstcheck for the
top-level README (and anything that's not included in the Sphinx docs,
really, which is right now the README, AUTHORS, and a JSON-specific doc
that I have yet to convert).

Bug: https://github.com/myint/rstcheck/issues/19
Change-Id: Ib787d11e7d21452570618288acdf15b3e85270a7
2020-06-17 20:43:37 +02:00
Jan Kundrát
a98e244abd docs: Fix Pygments highlighting
This will make it possible to use this document within Sphinx without
warnings.

Change-Id: I069cdc42b451102d4e731c8d848716126b0e518a
2020-06-17 20:23:58 +02:00
Jan Kundrát
c945bc40fe docs: move JSON and XLS instructions into the generated docs
Change-Id: I659dd8e53663286b1382d1786f46c5341bf7ea44
2020-06-17 20:23:58 +02:00
Jan Kundrát
749b9287a9 Merge remote-tracking branch 'origin/develop' in preparation of release 2.2
Change-Id: I3d503a9c1609e174f8b0a2e11afb646c879099cc
2020-06-17 12:55:12 +02:00
Jan Kundrát
202c76bd6e CI: enable Zuul builds
I'm hoping that merging this change will make Zuul handle that merge
commit.

Change-Id: I31fcc353ee6044355ebbd3ff61bc2ba1b33eb2c7
2020-06-17 12:50:15 +02:00
Jan Kundrát
eec0943ca2 Merge "Upload releases to PyPI upon tagging" into develop 2020-06-13 14:12:19 +00:00
Alessio Ferrari
06d59a5834 Introduce polarization mode dispersion (PMD)
Change-Id: I687591df4662884b734ec945e9968713019ea0fc
2020-06-12 09:08:22 +02:00
Alessio Ferrari
94949d955b Introduce computation of the chromatic dispersion
Change-Id: I3ee039154568d4255444fa8db5e89945851010f4
2020-06-12 08:45:46 +02:00
Alessio Ferrari
b74d0a4919 Add dispersion slope
Change-Id: Iced385787896793437be410a189c67e05da87714
2020-06-12 08:38:51 +02:00
Alessio Ferrari
c8fa7635e0 Bug fix in converting the dispersion D in beta2
The actual conversion formula includes the minus sign (-), not the
absolute value. We never noticed it because GNPy simulates only modern
networks based on uncompensated transmission, which have no DCUs. In
that case, the sign of beta2 along a path is the same for all the spans,
and the actual amount of NLI does not change.
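
For reference, the conversion in question boils down to this (an
illustrative sketch only, not GNPy's actual code; the function name is
made up):

    from math import pi
    from scipy.constants import c  # speed of light, m/s

    def beta2_from_dispersion(dispersion, wavelength):
        """dispersion in s/m^2 (SI), wavelength in m; returns beta2 in s^2/m."""
        # the minus sign is the whole point of this fix -- no abs() here
        return -dispersion * wavelength ** 2 / (2 * pi * c)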

Change-Id: I60a61d00c578a1a0436231a2bda8e3b6256fc8b3
2020-06-12 07:35:40 +02:00
Jan Kundrát
4c6cfbda5d Merge changes from topic "moving-examples" into develop
* changes:
  Enable saving the network as converted from XLS
  Save either to JSON or to CSV, not to both
  CLI: Allow Raman in path_requests_run
  CLI: Unify handling of the network topology
  Remove unused variables
  CLI: show default values in --help
  Unify handling of the --equipment option in examples
  CLI: specify shared code options just once
  Tweak the --help output
  Remove unused statements
  XLS -> JSON conversion: add a nice program for this
  transmission_main_example: Do not write out a CSV file
  tests: don't clutter up the source dir with generated CSVs
  use load_json instead of open coding
  tests: Do not produce JSON files in the source tree
  Split JSON export from service XLS reading
  tests: Do not create JSON files in the source tree
  Do not always write out JSON data when reading XLS files
  Remove incomplete support for "fuzzy name matching"
  distribute example data along GNPy
  Do not create *_auto_design.json by default
  tests: remove something which looks like a path, but is not a valid path
  tests: show that the examples still work when directly invoked
  examples: add some additional descriptions help context
  Distribute our examples via setuptools
  tests: call our example entry points via functions
  examples: prepare for overriding sys.args
  examples: move path_requests_run to gnpy.tools
  examples: move transmission_main_example into gnpy.tools
  examples: use ansi_escapes
  examples: manual coding style tweaks
  examples: autopep8 -aaaaaaaaaa
  examples: autopep8
2020-06-11 20:31:54 +00:00
Jan Kundrát
80e9423590 Upload releases to PyPI upon tagging
Change-Id: Ia0543a5a091069e9503c8ba6a09930c35a0104f7
2020-06-10 16:39:09 +02:00
Jan Kundrát
78010aaaef Enable saving the network as converted from XLS
This is a feature that was requested by Esther due to their workflows at
Orange. There's also a standalone converter available as
`gnpy-convert-xls`.

Change-Id: I1a483d168db0744fbf115e05e679e13b57d79398
2020-06-10 14:27:05 +02:00
Jan Kundrát
3857ab1dbb Save either to JSON or to CSV, not to both
I don't see a reason for that; the old debug texts were actively
misleading.

Change-Id: I2089bf8f87ec994770cc272e054650e18da4ef2a
2020-06-10 14:15:50 +02:00
Jan Kundrát
dbb09e4108 CLI: Allow Raman in path_requests_run
There's no need to limit this to just the transmission_main_example, so
let's unify this handling.

Change-Id: I585f407c7f80da12fd33baf7261c35c736d78df2
2020-06-10 14:11:44 +02:00
Jan Kundrát
94a8f3568a CLI: Unify handling of the network topology
Change-Id: I51c98ae13218715862fe9f585b2cc4b079498bee
2020-06-10 14:11:42 +02:00
Jan Kundrát
ae7c9321d0 Remove unused variables
They are never read from, not even in any commented-out debugging code,
so let's nuke these.

Change-Id: I3188511adb28242dc40418cb3bb90b38bc4fdb14
2020-06-10 14:09:52 +02:00
Jan Kundrát
648cc3a8e5 CLI: show default values in --help
I think this is a little bit cleaner than duplicating the info about the
default for some of these options.

Change-Id: If218a26ede3e71628f4839b4e505c4f4aa217699
2020-06-10 14:09:52 +02:00
Jan Kundrát
f9e0d18a9d Unify handling of the --equipment option in examples
Let's use the --option format instead of positional arguments; that way
it's more obvious that it can be omitted. Note that this constitutes a
change of behavior for the path_requests_run example.

Change-Id: Ic6653cf419e1a8573c3585190a88fc51500f549d
2020-06-10 14:09:52 +02:00
Jan Kundrát
2d57fd9f85 CLI: specify shared code options just once
Change-Id: I79d2c9dfd630ee72ff1b87d4b24025a20d1e2ce2
2020-06-10 14:09:49 +02:00
Jan Kundrát
f053f32301 Tweak the --help output
Try to indicate whether an option takes just JSON, or either a JSON or
an XLS file. Also add some extra descriptions.

Change-Id: Ifb81d46f6ac659da79b08201a414822e9c318a1e
2020-06-10 14:07:40 +02:00
Jan Kundrát
8e9d715e9f Remove unused statements
Change-Id: Ib859254468f02f7384d736fa9d7120a0ad1aaa15
2020-06-10 12:14:41 +02:00
Jan Kundrát
9fd55a5289 XLS -> JSON conversion: add a nice program for this
Esther mentioned that it is useful for her to be able to convert from
XLS files to JSON files. Let's add a full blown script for this.

I've also taken the liberty to refactor the code a bit so that there's
no default value, and to modernize everything with pathlib a little bit.

Change-Id: I80e50fc1280003910242ce1ff9fc9ae66e6d275b
2020-06-10 12:14:41 +02:00
Jan Kundrát
914d0dbecd transmission_main_example: Do not write out a CSV file
We talked about this earlier today on a call, and agreed with Esther and
Alessio that this is probably a relic from the past. The file does not
appear to contain much useful information, anyway, so let's try to
remove it and see whether anyone complains.

Change-Id: I215eeb37498b28b15ece2300f4bbdd184ac52f4a
2020-06-10 12:14:41 +02:00
Jan Kundrát
c38fe72ff7 tests: don't clutter up the source dir with generated CSVs
Change-Id: Ice9287dea2ed16ece2594e21eeab4c69e927947e
2020-06-08 20:35:58 +02:00
Jan Kundrát
80f63d32ed use load_json instead of open coding
Change-Id: I43cfbb7272bfdd834fad63e6715932ff45aeac0b
2020-06-08 20:06:11 +02:00
Jan Kundrát
9030f8f84f tests: Do not produce JSON files in the source tree
Change-Id: I7b9c65b93ba2ce64beef4de050b44c49a85163b7
2020-06-08 19:43:51 +02:00
Jan Kundrát
0d5f1c7d80 Split JSON export from service XLS reading
We have a better module for this.

Change-Id: Id4b68d3ddb119f27df3dfac52277981558fc5e50
2020-06-08 18:47:15 +02:00
Jan Kundrát
5bc42332cd tests: Do not create JSON files in the source tree
Change-Id: I5ef71cc82466367fa9e9eea4d3f02453a4d2f469
2020-06-08 18:31:24 +02:00
Jan Kundrát
093b85d4a3 Do not always write out JSON data when reading XLS files
There's no reason for this; in fact, the code got easier to read once
that detour to disk was removed.

Change-Id: I45db215898da962e625a7fea6eda57744e21ff8a
2020-06-08 18:31:21 +02:00
Jan Kundrát
0efa0d310d Remove incomplete support for "fuzzy name matching"
This is something which got added in bc9eee32, but it never got
finalized to have a user-visible effect. To the best of my knowledge, it
only created a file which was never used.

I removed code which created that file in 0d542f22, so let's clean up
the rest.

I think this should also restore functionality of running convert.py in
a standalone mode. Looking at the ArgParser, the invocation never
considered the names_matching parameter.

Change-Id: Id0f4aa1db2d22233f74fb273176168a16ace4072
2020-06-08 18:30:38 +02:00
Jan Kundrát
8eb5980ca9 distribute example data along GNPy
I would like to create a package for distribution via pip, and this seems
like the path of least resistance.

This is, apparently, the way for shipping arbitrary data with Python
[1]. I've at least tried to make it user-friendly by adding a simple
utility which just prints out whatever that data path is.

[1] https://python-packaging.readthedocs.io/en/latest/non-code-files.html

Change-Id: I220ecad84b1d57d01e3f98f15befc700bd97c0b8
2020-06-08 18:30:36 +02:00
Jan Kundrát
754be7ca08 Do not create *_auto_design.json by default
It's a bad habit to write files into the source code repository. It will
also become impossible if gnpy is installed into a systemwide, possibly
read-only location.

The old behavior can be reactivated by using an extra option to tell
GNPy where to put the generated file.

Change-Id: I9ad43890ca5886f516885de5938a4778870a06c5
2020-06-08 18:28:59 +02:00
Jan Kundrát
c5c5b693f2 tests: remove something which looks like a path, but is not a valid path
As I'm moving the top-level directory `examples/` to another place, I
wanted to clear the source of any mentions of examples which are not
actually valid paths.

Change-Id: If6cce20feacfbbb79549e865d06aa00fd2dcd08d
2020-06-08 18:28:59 +02:00
Jan Kundrát
7f816eb6e7 tests: show that the examples still work when directly invoked
Since Ic4a124a5cbe2bd24c56e4565d27d313fe4da703f, there was no automated
test which would check if the generated examples *really* work. When I
was playing with this, I managed to break it at least once (especially
when working on overriding sys.args, i.e.,
I53833a5513abae0abd57065a49c0f357890e0820).

This now requires an equivalent of `pip install` before the tests can be
run.

Change-Id: I595a3efe29b3ee13800c5cb71f28a5f370988617
2020-06-08 18:28:59 +02:00
Jan Kundrát
1009b44d2a examples: add some additional descriptions help context
The main reason for doing this is the next commit, which re-adds testing
of the generated wrappers.

Change-Id: I7137c6cf7a5b414fc708a15b125eaf88e996366c
2020-06-08 18:28:56 +02:00
Jan Kundrát
3b61c6ca4c Distribute our examples via setuptools
Cc issue #352.

Change-Id: I31e67130540fabeb2666dea38da6c435236e7f5b
2020-06-05 18:04:53 +02:00
Jonas Mårtensson
cc11bd186c Update compute_constrained_path
This patch proposes a new implementation of the compute_constrained_path function based on the same method as the newly proposed compute_k_constrained_paths function, i.e. using shortest_simple_paths instead of all_simple_paths. This method is more efficient and avoids having to set a cutoff parameter. The new implementation should be identical to the old one from an external perspective, except that it finds a path with include node constraints in more cases.
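
Roughly, the idea is this (an illustrative sketch, not the actual code):
iterate over paths from shortest to longest and stop at the first one
that visits all requested include nodes.

    import networkx as nx

    def first_path_through(graph, source, target, include_nodes):
        for path in nx.shortest_simple_paths(graph, source, target):
            if all(node in path for node in include_nodes):
                return path
        return None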

Change-Id: Ia93b61c0af27076ed5088013bc87787a2920b629
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-06-03 20:57:39 +02:00
Jan Kundrát
0d1225824e tests: call our example entry points via functions
I would like to avoid that extra fork to a child Python interpreter (it
looks like something that can be easily avoided). It's something that's
possible now that the code ships just some trivial wrappers (which are,
in turn, needed for setuptools' `console_scripts`).

This cannot use the `capsysbinary` fixture for wrapping of stdout/stderr
due to something in pytest which has already been fixed upstream, but has
not been released yet (May 2020). Let's use `capfdbinary`, which works fine.
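
A hypothetical test just to illustrate the fixture (not one from our
suite): `capfdbinary` captures at the file descriptor level and hands
the output back as bytes.

    def test_prints_hello(capfdbinary):
        print('hello')
        captured = capfdbinary.readouterr()
        assert captured.out.startswith(b'hello')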

Change-Id: Ic4a124a5cbe2bd24c56e4565d27d313fe4da703f
See-also: https://github.com/pytest-dev/pytest/pull/6926
2020-06-03 19:29:17 +02:00
Jan Kundrát
a2ecfd924c examples: prepare for overriding sys.args
...which will be done in the next commit. One has to be careful with
sys.argv here because it uses different indexing than when passing args
explicitly.

Change-Id: I53833a5513abae0abd57065a49c0f357890e0820
2020-06-03 19:29:03 +02:00
Jan Kundrát
95c3c9c488 examples: move path_requests_run to gnpy.tools
Change-Id: Id23dfa81bf9a9b347ea9c99f77d5735b62911d67
2020-06-03 19:22:53 +02:00
Jan Kundrát
11033a284f examples: move transmission_main_example into gnpy.tools
This converts our first free-standing example entry point into a
function. It will become very handy when we start distributing these
entry points via setuptools.

Change-Id: Icd2e4658337f93cd0b0301978e2dc640de0cc72b
2020-06-03 19:06:31 +02:00
Jan Kundrát
357e6cbf18 examples: use ansi_escapes
Change-Id: I958b00a7d7c079d9330cc0f4aec1f72b4a102223
2020-06-03 18:32:16 +02:00
Jan Kundrát
bfa6d29908 examples: manual coding style tweaks
Change-Id: Ie0ac42561afad331b8d22df0576ef071f358c5ed
2020-06-03 18:29:07 +02:00
Jan Kundrát
21e0589e7f examples: autopep8 -aaaaaaaaaa
Change-Id: I811bc80d9eca9df34c554362de62f98a897cbb05
2020-06-03 18:18:19 +02:00
Jan Kundrát
4d836246ec examples: autopep8
Change-Id: Ieed050fdfb9db57722b2b8de642c25ad2e4a50eb
2020-06-03 18:17:47 +02:00
Jan Kundrát
566943a099 Merge "Support propagation of single channel" into develop 2020-05-30 10:17:38 +00:00
Jan Kundrát
a4c1ea1f55 CI: Restore functionality via running on Python 3.6 on native packages
We have declared that we're supporting Python 3.6 as the minimal
required version of Python. Let's enforce this via testing. Let's also
do it via a long-term supported distro (CentOS8 in this case) so that we
can have someone else maintain stuff for us; it's easier to rely on
these prebuilt packages instead of downloading Python from upstream all
the time.

Also, Python 3.7 builds are currently broken in the CI because of some
breakage, either in the upstream zuul-jobs repo, or in Vexxhost's VM
images. So, now that we have Python 3.6 VMs available, use this to
restore CI functionality. That part can be reverted in future when
Debian Buster-based builds are working again.

I've also tried to use Python 3.8 on latest Fedora (version 32). There
were a couple of gotchas, including /tmp on tmpfs with not enough space
for building all required packages. Also, there's another issue with
building Pandas from scratch, so let's defer using Python 3.8 until
later. It's something that I will definitely look into, though, because
of significant speed improvements in that distro.

Change-Id: Ic19bf58875ba6d6afaee77f5c9b467261ec686d0
Depends-on: https://review.gerrithub.io/c/Telecominfraproject/oopt-zuul-jobs/+/494597
2020-05-30 11:55:43 +02:00
Jan Kundrát
6b10ed15f8 Restructuring and reorganization of the library structure
Change-Id: I111922113858526c21b848b28c86a80d5c013d65
2020-05-26 19:32:38 +02:00
Jan Kundrát
c8daa5ed8c docs: basic stuff about the module structure
As a bonus, in the Python shell, `help(gnpy)`, `help(gnpy.core)`, etc,
now produce at least some useful information.

Change-Id: I76ade6f2456fcebd3c0a147374815dd245dc4b10
2020-05-26 18:36:30 +02:00
Jan Kundrát
0b03725295 tests: fix import of correct_json_route_list
It got moved in de58d7d7c2, let's import
it directly, not via examples/.

Change-Id: I0c36fcfdb775baac6fafc05ecd41ac39bafd750d
2020-05-23 22:14:52 +02:00
Jan Kundrát
ee5e64408d examples: common code for data loading
This also moves SimParams handling to a single place. As a result,
path_requests_run has just become Raman-aware (to the minimal possible
extent, OK).

Change-Id: I4e31af5c67335963ddab567d304f48a899cd569e
2020-05-23 22:03:23 +02:00
Jan Kundrát
b96ffe6c7b examples: unify exception handling
Change-Id: I7a6ac6ba54c597214bb85ee0dda2d6313b677730
2020-05-23 22:03:23 +02:00
Jan Kundrát
3b981853d4 SimParams: pretty exceptions upon a misconfiguration
Change-Id: I69f47925dd08c492b295b6824618a4595d142954
2020-05-23 22:03:23 +02:00
Jan Kundrát
6fa3ef8df1 params: Remove extra if
`kwargs` is always a dict, so it's still safe to probe individual
elements in there.

Change-Id: I2dea3159764807a2ff6a96d613ff69c68a7ba3ce
2020-05-23 22:03:23 +02:00
Jan Kundrát
1b2b048b47 reorganization: move request processing and request disjunction handling into topology.requests
Change-Id: I14902a2e15cc5fd27530cb294f4f549f6974fd49
2020-05-23 22:03:23 +02:00
Jan Kundrát
94c5281260 flake8: Remove unused variables
Change-Id: I67ff11f2580690f58dbe9909a92f8725a65bd7c7
2020-05-23 21:29:59 +02:00
Jan Kundrát
7da4ec08d8 reorganization: move JSON-specific bits from path_requests_run to tools.json_io
Change-Id: I03cf4e298300387ee5d56251e2e552e111b59f65
2020-05-23 21:28:06 +02:00
Jan Kundrát
2b473d26d3 reorganization: move example plotting into an extra submodule
Change-Id: Ibb7a81f360493ad8c1b7e58303e2e256c64dd755
2020-05-23 20:44:51 +02:00
Jan Kundrát
9e74e8b0a0 De-JSONify gnpy.core
Change-Id: I2657f3174209bef5912c7d0809fee876830ad11c
2020-05-23 20:44:51 +02:00
Jan Kundrát
648039521e reorganization: move all JSON processing into an extra module
We agreed that `gnpy.core` should only contain stuff for propagating
wavelengths. Conceptually, JSON parsing and even instantiating these
network elements from data obtained through JSON is *not* something that
is on the same level -- and this will become more important when we move
into YANG format in future.

Also, instead of former `gnpy.core.equipment.common`, use
`gnpy.tools.json_io._JsonThing`. It is not really an awesome name :),
but I think it sucks less than a thing called "common" which would no
longer really be "common" at all in that new file.

Change-Id: Ifd85ea4423d418c14c8fae3d5054c5cb5638d283
2020-05-23 20:44:47 +02:00
Jan Kundrát
24bc023a07 refactoring: reduce amount of Python magic in network building
Similarly to I7ab9f64d7ac2042e8a16d031ba5562a6eb412471, it's better to
be explicit about what's going on. It makes it easier to reason about
the code.

Change-Id: Ic7f4a590567f1f5903222be8dae53521424d8f77
2020-05-23 18:14:41 +02:00
Jan Kundrát
19c2ae7f7a Qualify all usages of classes from gnpy.core.elements
This will be needed by one of the follow-up commits which move JSON
manipulation into a single file. We have to distinguish between "Roadm"
the JSON representation and "Roadm" the network element which propagates
spectrum.

Change-Id: I6888842c57c3a57849fabe75d0ff6f5bbfab426a
2020-05-23 18:11:08 +02:00
Jan Kundrát
9f6894a176 Remove extra indirection -- just instantiate Edfa directly
Change-Id: Ib957063d3333597fd58ba97bb78a187559f42d86
2020-05-23 17:52:39 +02:00
Jan Kundrát
a094568d6e equipment: move NF estimation into science_utils
I envision equipment.py as something which deals exclusively with
GNPy's traditional JSON-formatted data, so make sure we do not include
any computation in that file.

Change-Id: I8473cccd84243147181a7195ba39fc6c9db3c42f
2020-05-23 16:37:56 +02:00
Jan Kundrát
f728d96d07 equipment: mark internal functions as such
Change-Id: I3cbed25568c0e85685caf3e279f3c3a8e37247f1
2020-05-23 16:37:56 +02:00
Jan Kundrát
6b1fb7061f Move placeholder EDFA NF calculation to where it gets used
I think that equipment.py should be only concerned with constructing
network elements from JSON data.

Change-Id: I777835b02a23b76fb1d40c3a966e72b606e9c205
2020-05-23 16:19:05 +02:00
Jan Kundrát
7ef505f259 Do not perform magic when reading equipment config
I think that being explicit about what gets instantiated is much easier
to read. For one, it enables "find usages of this thing" in IDEs.

The original code was also insecure because it would happily invoke any
function available in the global context with user-supplied data(!).
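
The contrast, sketched with made-up placeholder classes (not GNPy's real
ones):

    class Edfa: ...
    class Fiber: ...

    # before: look up whatever class the user-supplied string names
    #   cls = globals()[type_name]        # insecure and opaque to IDEs
    # after: an explicit whitelist of what may be instantiated
    ALLOWED = {'Edfa': Edfa, 'Fiber': Fiber}

    def make_equipment(type_name, **kwargs):
        return ALLOWED[type_name](**kwargs)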

Change-Id: I7ab9f64d7ac2042e8a16d031ba5562a6eb412471
2020-05-23 16:07:48 +02:00
Jan Kundrát
c8d394348d Tweak error messages a bit
Change-Id: I49af7ea0d40a901cee271f21306288337f6b041f
2020-05-23 16:07:17 +02:00
Jan Kundrát
0daa1c3e8c Rely on pip for missing dependency detection
We have some docs on how to install this, and we rely on the
requirements.txt file for specifying what packages are required. There's
no point in arbitrarily handling the `xlrd` package in a different
manner.

Change-Id: I2355440ada7fd0cc4868aeff5a8956729655c96b
2020-05-23 15:45:10 +02:00
Jan Kundrát
60bafd114d examples: remove unused imports
Change-Id: I5c60243ecf7cdb56bf1060e2630e35ffb5f3548f
2020-05-23 15:33:18 +02:00
Jan Kundrát
11509f5686 requests: simplify ResultElement inheritance
This has been the only place where content from gnpy.tools.service_sheet
was being used outside of tests and examples. It looks like something
which is actually *not* used anywhere (the ResultElement instances are
only ever appended to a list which gets used as-is, so we do not need any
custom comparisons or hashing).

Change-Id: Ib6ddcf55779218d602620e77973d88ad62d0ec7b
2020-05-23 15:23:27 +02:00
Jan Kundrát
785c823fa2 coding style: separate property definitions from each other
While not strictly needed in Python, it's much clearer this way.

Change-Id: I5f8caccde6440cb4032aae5deae577d328834a75
2020-05-23 15:23:27 +02:00
Jan Kundrát
9faf6430a5 reorganization: gnpy/{core => tools}/service_sheet.py
Change-Id: I88559cc718536f222b8ea9829bcc72a425c062ca
2020-05-23 15:23:27 +02:00
Jan Kundrát
01c566a325 reorganization: gnpy/{core => topology}/spectrum_assignment.py
Change-Id: Ic6194ce639dcb2f9419372febe0f2b58473edb38
2020-05-23 15:05:42 +02:00
Jan Kundrát
baa9171315 docs: improve docs markup a bit
Change-Id: Ic1f60dbdfbe08111026cc3bdcd8d56293cb4555d
2020-05-23 14:59:35 +02:00
Jan Kundrát
2e50337f38 docs: remove implementation details from the documentation
Change-Id: Ic5ff20b16cc932953729f9cae2e8e3d96058e92d
2020-05-23 14:59:18 +02:00
Jan Kundrát
8daa298699 Import referenced exceptions
Change-Id: Iba0dc2682edc59e211bd8bbe638ce0bc1f4917fe
2020-05-23 14:58:49 +02:00
Jan Kundrát
76cdd5dc71 parameters: remove unused logger
Change-Id: I2d51081c8e1f7315861c2280ceb92304327c2ac6
2020-05-23 14:30:15 +02:00
Jan Kundrát
07eb2dd13a Refactoring: conversion functions instead of gnpy.core.units.UNITS
The TL;DR behind this patch is that it's better to have a utility
conversion function instead of having a multiplier LUT and open code which
implements the conversion.

The FiberParams handling looked fishy -- apparently, it was keeping the
multiplier around, but it was unconditionally setting the units to
meters, anyway. Given that the units were not being preserved anyway
(everything got converted to meters), and that the multiplier was not
used anywhere, let's refactor the code to just convert to meters using
our new utility function, and remove the unused argument.
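
The shape of the change, roughly (an illustrative sketch; the real
helper in gnpy may differ in name and signature):

    _MULTIPLIER = {'m': 1, 'km': 1e3}

    def convert_length(value, units):
        """Return the length in meters for a value expressed in `units`."""
        return value * _MULTIPLIER[units]

Call sites then just do `length_m = convert_length(80, 'km')` instead of
multiplying by a looked-up factor by hand.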

Change-Id: Id886d409a4046f980eed569265baefd97db841bd
2020-05-23 13:50:25 +02:00
Jan Kundrát
04e764d024 FiberParams: print out the configuration upon failure
Apparently it's sometimes not obvious where the input data come from
(see the next commit), so let's show the data which caused this exception to
the user.

Change-Id: Id333903a0549c4ef5dc37c2f6ff340bd357279ea
2020-05-23 13:50:25 +02:00
Jan Kundrát
05ccb14e5d Remove unused gnpy.core.execute submodule
Change-Id: I3972f8321547bc596c018fa04232edfa23b97581
2020-05-23 13:30:19 +02:00
Jan Kundrát
f60d035e66 Ensure that automatic_fmax gets used and not reinvented
Change-Id: I1f77f47eec3a3f0e6f22a073a98edca20958d321
2020-05-23 13:13:22 +02:00
Jan Kundrát
0823f8de46 Move automatic_nch, automatic_fmax into utils
I think that gnpy/core/equipment.py should contain only stuff which
prepares the equipment_config, not anything "lower level" that is reused
from other places.

Change-Id: I0cd593fd3e5558178ddd0ad8fff5c596e022894a
2020-05-23 13:08:24 +02:00
Jan Kundrát
a453c57996 document and test automatic_nch and automatic_fmax
Change-Id: Ib6a5e5f8278456725c810a9f7f0c7f5cafd78bd2
2020-05-23 13:03:23 +02:00
Jan Kundrát
2f84bb5286 Merge "Zuul: test with Python 3.6 as well" into develop 2020-05-23 09:03:20 +00:00
Jan Kundrát
7b8e68aea9 Zuul: test with Python 3.6 as well
We're saying that Python 3.6 is the minimal version that we support, so
let's make sure there's CI coverage for that.

I also tried enabling Python 3.8, but somehow the build of Pandas
failed. I don't feel like debugging a Pandas build failure today, so
let's postpone this thing until later. Just having 3.6 is a net
improvement, and we can play with even newer Python later -- and perhaps
on a newer distro, anyway.

Change-Id: I28a8c282225b7070ed3dddba56cccc8def313a77
2020-05-23 10:24:33 +02:00
Jan Kundrát
8d553a255f Merge "docs: brand the generated docs with our shiny logo" into develop 2020-05-22 13:04:13 +00:00
Jan Kundrát
2766e37438 CI: do not clutter up Zuul's failure display with the coverage report
Just ask pytest-cov to not generate any test report. The coverage data
are still generated and will be used in later steps of the build
pipeline.

Change-Id: Ic5bb6a48e2abf6ee52e1c2650727ce4170611171
2020-05-21 12:52:19 +02:00
Jan Kundrát
f56e64410b docs: brand the generated docs with our shiny logo
Image source: I took the existing banner, cropped it and resized to
200px width as per the theme docs.

Change-Id: Ic79b6164d557298746fe878de31ee0a9b0d93923
2020-05-19 18:11:29 +02:00
Jan Kundrát
15ea7218e9 Merge gnpy.core.node.Node into gnpy.core.elements
That class is an internal implementation detail, so mark it with a
leading underscore as per Python idioms.

Also, tweak the docs so that there's less duplicate information and
more cross-references.

Change-Id: Ieb1c8034ab5b442032396d7c4bbd0a697c7eb492
2020-05-19 17:29:11 +02:00
Jan Kundrát
db28011c61 coding style: manual tweaks
This mainly reverts some auto-fix-ups done in
I2f0fca5aa1314f9bb546a3e6dc712a42580cd562 which do not make that much
sense. By reverting them by hand, it's (hopefully) easy to see what is
just the tool's work and what is an opinionated preference.

Change-Id: I6cb479e34b552fadc85c41b4b06b24e60c87b4a3
2020-05-19 13:59:56 +02:00
Jan Kundrát
faccc23018 Always use our ansi_escapes module
Change-Id: I27bac671caa7544f51e7935ee120c9fcd6c942a7
2020-05-19 13:59:56 +02:00
Jan Kundrát
49514c0c70 flake8: fix F401 (unused imports)
Change-Id: I6f79f3a4c071b332e45033f4189a2af6c66a6e05
2020-05-19 13:45:05 +02:00
Jan Kundrát
0b1557fdf1 flake8: fix F841 (local variable is assigned to but never used)
Change-Id: Ic79d0189bf4e97b953604edd0a6932f28c71a071
2020-05-19 13:30:04 +02:00
Jan Kundrát
46f89aa770 coding style: autopep8 in an aggressive mode (-aaaaaaaaaa)
I decided to skip the following chunk of the diff because I think that
it would actually make the code a bit harder to read:

diff --git gnpy/core/service_sheet.py gnpy/core/service_sheet.py
index 9965840..9834111 100644
--- gnpy/core/service_sheet.py
+++ gnpy/core/service_sheet.py
@@ -41,8 +41,22 @@ logger = getLogger(__name__)

 class Request(namedtuple('Request', 'request_id source destination trx_type mode \
     spacing power nb_channel disjoint_from nodes_list is_loose path_bandwidth')):
-    def __new__(cls, request_id, source, destination, trx_type,  mode=None, spacing=None, power=None, nb_channel=None, disjoint_from='',  nodes_list=None, is_loose='', path_bandwidth=None):
-        return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing, power, nb_channel, disjoint_from,  nodes_list, is_loose, path_bandwidth)
+    def __new__(
+            cls,
+            request_id,
+            source,
+            destination,
+            trx_type,
+            mode=None,
+            spacing=None,
+            power=None,
+            nb_channel=None,
+            disjoint_from='',
+            nodes_list=None,
+            is_loose='',
+            path_bandwidth=None):
+        return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing,
+                               power, nb_channel, disjoint_from, nodes_list, is_loose, path_bandwidth)

 # Type for output data:  // from dutc

diff --git tests/test_automaticmodefeature.py tests/test_automaticmodefeature.py
index 0e5f633..5ba5881 100644
--- tests/test_automaticmodefeature.py
+++ tests/test_automaticmodefeature.py
@@ -32,7 +32,26 @@ eqpt_library_name = Path(__file__).parent.parent / 'tests/data/eqpt_config.json'
 @pytest.mark.parametrize("net", [network_file_name])
 @pytest.mark.parametrize("eqpt", [eqpt_library_name])
 @pytest.mark.parametrize("serv", [service_file_name])
-@pytest.mark.parametrize("expected_mode", [['16QAM', 'PS_SP64_1', 'PS_SP64_1', 'PS_SP64_1', 'mode 2 - fake', 'mode 2', 'PS_SP64_1', 'mode 3', 'PS_SP64_1', 'PS_SP64_1', '16QAM', 'mode 1', 'PS_SP64_1', 'PS_SP64_1', 'mode 1', 'mode 2', 'mode 1', 'mode 2', 'nok']])
+@pytest.mark.parametrize("expected_mode",
+                         [['16QAM',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           'mode 2 - fake',
+                           'mode 2',
+                           'PS_SP64_1',
+                           'mode 3',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           '16QAM',
+                           'mode 1',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           'mode 1',
+                           'mode 2',
+                           'mode 1',
+                           'mode 2',
+                           'nok']])
 def test_automaticmodefeature(net, eqpt, serv, expected_mode):
     equipment = load_equipment(eqpt)
     network = load_network(net, equipment)

Change-Id: I522c45c079b3a9540568657e2ae0a4bfc5fb1272
2020-05-19 12:53:11 +02:00
Jan Kundrát
3548ed74e2 coding style: autopep --in-place --recursive --jobs 4 --max-line-length 120 gnpy/ tests/
Change-Id: I2f0fca5aa1314f9bb546a3e6dc712a42580cd562
2020-05-19 12:40:00 +02:00
Jan Kundrát
145653df6e reorganization: gnpy/{core => topology}/request.py
Change-Id: Ib399549479b56634c681930aa444b657e5f58ca7
2020-05-19 11:56:02 +02:00
Jan Kundrát
c7589e0bca linters: remove unused variables
Change-Id: I16cb9fab7996efcd01eec1f5dee7be12adae43c2
2020-05-19 11:55:26 +02:00
Jan Kundrát
531810cc85 Remove unused class import
Change-Id: I3e0b7174f6ff60a632bbcde56178499755b8b7df
2020-05-19 11:55:17 +02:00
Jan Kundrát
63a6256b5e pep8: rename classes to use CamelCase
Change-Id: Ia696e05b72f1bc5feb570996f492042dafab262d
2020-05-19 11:54:27 +02:00
Jan Kundrát
c3febb6db4 gnpy/core/request.py: autopep8
Change-Id: I8790f67e6d7993b50d6fc0295f3a8a88439ac9ae
2020-05-19 11:54:27 +02:00
Jan Kundrát
8b1d8b3479 reorganization: XLS conversion goes to gnpy.tools
Change-Id: Ibbaddacd24a2d0f6f6d98fdc30d57da3be188338
2020-05-19 11:54:23 +02:00
Jan Kundrát
3168603908 gnpy/core/convert.py: coding style fixes
Change-Id: I760ff33813e916b33cb7cb3f6cef470200587acf
2020-05-19 11:49:55 +02:00
Jan Kundrát
a2128227bd gnpy/core/convert.py: run autopep8
Change-Id: Ib1a70a4f7bf8ab36aa1ec9265e2afaf69ca59d94
2020-05-19 11:49:50 +02:00
Jan Kundrát
376826b3ae Merge "Remove debug dumping of "city mapping"" into develop 2020-05-19 09:32:43 +00:00
Jan Kundrát
4b258cdf2e Merge changes from topic "namespaces-and-modules" into develop
* changes:
  docs: emphasise the API reference over bits which are duplicated in the README
  docs: remove list of authors
2020-05-19 08:01:41 +00:00
Jonas Mårtensson
fbdd132a3d Support propagation of single channel
It always seemed like a strange restriction to not allow this.

Change-Id: Ice3ed3ecc08f42b6ef8b74d4a6bc3b1794ff078a
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-05-14 21:14:12 +02:00
Jan Kundrát
eebcebb33d Remove extra elements from default EDFA parameters
Since Ia8c8d734c7045062ce123360f4a1432490384118, the lists no longer
have to have the same length.

Change-Id: I43ff10defa414cde0d6ff26a4d238b8680e39899
2020-05-14 16:43:51 +02:00
Jonas Mårtensson
20152036ff Allow different lengths of EDFA config lists
Currently, interpolation of nf_ripple or gain_ripple fails if the length
of the input list is different from that of the dgt input list, since the
amplifier frequencies used for interpolation are calculated based on the
length of the dgt input list. There seems to be no good reason for this
restriction, so I propose to calculate the amplifier frequencies separately
for the different inputs. This also makes it possible to specify a flat
ripple with just one number and a flat dgt with two numbers.
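
A rough sketch of the proposed behaviour (illustrative only, not the actual
patch): build the interpolation axis from each input list's own length, so
nf_ripple, gain_ripple and dgt may all have different lengths.

    from numpy import linspace, interp

    def interp_on_channels(values, f_min, f_max, channel_freqs):
        own_axis = linspace(f_min, f_max, len(values))
        return interp(channel_freqs, own_axis, values)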

Change-Id: Ia8c8d734c7045062ce123360f4a1432490384118
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-05-14 15:30:57 +02:00
Jan Kundrát
2a477071a0 docs: emphasise the API reference over bits which are duplicated in the README
Also make sure that all modules are covered. It seems that there's no
automatic way for doing this, aargh. On the other hand, there's
apparently no need to repeat all the Sphinx markup blurb, and even
sectioning works nicely (read, "as I expect it to work") now :). I think
that it's still necessary to keep these "intermediate files" that only
trigger package-level and module-level autodocs, but hey, I can live
with this.

Change-Id: I705e0054cd7cd4350171770795d69f5c15c226d6
2020-05-07 19:54:55 +02:00
Jan Kundrát
f02d11e8bc docs: remove list of authors
The documentation is something which our users see as one of the first
things when they go and try GNPy. With all respect to the people who have
contributed over the years, there is more important information to
convey to a first-time user than a list of authors.

Change-Id: Ibe5f6477f9736b9ab71effcf0eccef7c7fdfdde5
2020-05-07 19:05:50 +02:00
Jan Kundrát
0d542f22a7 Remove debug dumping of "city mapping"
This JSON file is never used by the rest of the code. Let's get rid of
one of these files which are put into the source code directory during
unit test execution.

Also remove other dead code; thanks to Esther for catching this.

Change-Id: I30a4e7edcf638162ec438fbf7f00d26d78944ac3
2020-05-05 19:20:00 +02:00
Jan Kundrát
5af195bd2b Remove unused imports
Change-Id: I66174048a9eaab0f79ba4c3b1d31ef4dc9c2009b
2020-04-30 17:30:55 +02:00
Jan Kundrát
7ab93e7cd9 tests: frequency to wavelength
I decided to keep it around because I know that some people would like
to see those nanometers. Let's make sure it works.
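
The conversion under test is essentially this (an illustrative helper,
not necessarily the exact gnpy function):

    from scipy.constants import c  # speed of light, m/s

    def freq_to_nm(frequency_hz):
        """Return the wavelength in nanometers for a frequency in Hz."""
        return c / frequency_hz * 1e9

    # e.g. 193.1 THz is roughly 1552.5 nm
    assert round(freq_to_nm(193.1e12), 1) == 1552.5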

Change-Id: Ib279cc8380a77f478da7a2bbc1e045a718446404
2020-04-30 17:30:55 +02:00
Jan Kundrát
0aec47ddeb tests: remove dead code
Change-Id: Id8bc0cae7d91b2a80032890b2e30c70be566d052
2020-04-30 17:30:55 +02:00
Jan Kundrát
9a54dbab43 Remove unused functions
Given that everything else just uses these constants as imported from
numpy, there's no point keeping these wrappers around.

Change-Id: I0e19e05f40dc79d8005e915cf3ffb5e36328421a
2020-04-30 17:30:55 +02:00
Jan Kundrát
fc03be8bbe tests: use doctest instead of an explicit runner
...and expand the coverage a wee bit while we're at it.

Change-Id: I0de8445dc29f46e5f238bff0ca0e1f63fe19712d
2020-04-30 17:30:55 +02:00
Jan Kundrát
32f10a4507 tests: show how much *test* code is actually executed
Change-Id: Iba05c8b212b7217fe3d45a50f3d24f017ab09a74
2020-04-30 17:30:55 +02:00
Jan Kundrát
04544d41f6 Merge "Fix #353 - String representation of network elements" into develop 2020-04-30 15:29:07 +00:00
Jonas Mårtensson
c87be89e07 Fix #353 - String representation of network elements
Currently the string representation of some elements in elements.py refers to parameters that are not assigned until the propagate function runs.
Printing an element or trying to access its string representation before propagation therefore raises a TypeError.

This patch should fix the issue.

Change-Id: I29962f3c00e1f4fb7935535d4514a9579bc0c918
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-04-29 10:29:44 +02:00
Jan Kundrát
efa05f0653 Merge "Find paths with include nodes constraint" into develop 2020-04-28 16:42:10 +00:00
Jan Kundrát
4cf4db9bc2 Merge changes If9ebdfad,I9bc7e5bb,Ia7cf2026 into develop
* changes:
  include all 'no' Capital/non capital combination in test
  small refactor
  Enable ILA and FUSED type in routing constraints with xls input
2020-04-27 11:10:24 +00:00
Jonas Mårtensson
16434c5737 Find paths with include nodes constraint
The compute_constrained_path function in request.py handles the include-nodes constraint by first finding all simple paths shorter
than a cutoff length and then checking whether any of the paths meets the constraint. The cutoff is currently set to 120, which is
too small for large networks. For the CORONET_CONUS_Topology.json topology, for example, the shortest path between some node
pairs is longer than the cutoff.

This is a small proposed change to first compute the length of the shortest path and then set the cutoff to at least 20% (a
conservative number) higher than this. I also propose to increase the minimum cutoff from 120 to 150.
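
The heuristic, sketched (illustrative only, not the exact patch): derive the
cutoff from the hop count of the shortest path, add a 20% margin, and never
go below 150.

    import networkx as nx

    def compute_cutoff(graph, source, target, margin=1.2, minimum=150):
        shortest_hops = nx.shortest_path_length(graph, source, target)
        return max(minimum, int(shortest_hops * margin) + 1)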

Change-Id: I97ff2915fb38e4681bd64d60f2cd6dfd422afd4f
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-04-24 16:22:20 +02:00
EstherLerouzic
14ee9c9a91 Removing exit(1) and using exception instead
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I13bb8c87a7f764bda6e13f233c7b70909b9ee035
2020-04-24 10:41:17 +00:00
Jan Kundrát
7cfc4bd1ec linters: activate more checkers
"naming" with the "N" prefix looks reasonable, and it looks like my venv
already had mccabe installed for that cyclomatic complexity check. Let's get these
in.

Change-Id: Ie0cf4d8c20486f0f561db01903b3aa93cbebdb12
2020-04-24 12:22:32 +02:00
Jan Kundrát
9d55cde50d linters: Increase maximum line length
The code is already using long lines, so I think we can avoid a
discussion about whether to raise this limit, and instead we can focus on
another discussion :) on what that limit should be.

Random googling suggests that GitHub's code renderer operates at 119
chars per line. A random style guide mentions going to 100 or 120. Prior
to this change, there have been 712 violations of PEP8 check E501.
Picking 119 would bring us down to 21 violations, and using 120 reduces
it further to 16. I think that 120 is a cuter number than 119, so 120 it
is unless someone objects hard enough to propose another cutoff in a
follow-up patch.

Change-Id: I57f48fcb6846fea35223daac91aa2e8c7afabc63
2020-04-24 12:21:18 +02:00
Jan Kundrát
72183c24da Merge changes from topics "coverage", "examples-via-tests" into develop
* changes:
  CI: run flake8 for coding style nagging in changes
  Fix syntax of the bandit config file
  Upgrade pytest, and list it as a test-only dependency
  CI: Activate coverage diffing
  tests: simplify tox setup
  CI: Run both non-coverage and coverage build/test jobs
  tests: Check if example code provides exact same output
  tests: Run examples via pytest
  Show more details upon a test failure
2020-04-24 09:32:03 +00:00
Jan Kundrát
4d1a628488 CI: run flake8 for coding style nagging in changes
This combines a few pieces of magic:

- Zuul merges speculatively, but in the git trees that the build VMs
see, the "current branch" is always the state as-if the currently tested
change was merged, while origin/"current branch" is the result of all
previous changes, if any, applied to the current tip of the target
branch.

- flake8 has a --diff option which appears to work correctly, *but* it
reports on all lines that are present in its input, including the
context -> hence the -U0 option.

Change-Id: I28403ff0132fac0b52696c82306732a4f81fa66a
See-also: https://gitlab.com/pycqa/flake8/issues/58
2020-04-24 00:14:02 +02:00
Jan Kundrát
fdeaf75361 Fix syntax of the bandit config file
Unlike codacy, it seems that bandit the CLI tool complains about these:

  Unable to parse config file: File contains no section headers.
  file: '/home/jkt/work/TIP/oopt-gnpy/.bandit', line: 1
  "skips: ['B101']\n"

Change-Id: Iab93052fd8aaf1754571a3c66796cfe3026f6a63
2020-04-23 18:38:36 +02:00
Esther Le Rouzic
5695fafac6 Merge "Adding a set of tests on spectrum_assignment functions" into develop 2020-04-23 13:17:31 +00:00
Jan Kundrát
0fe1d195a3 Merge "Update Excel userguide" into develop 2020-04-23 12:53:40 +00:00
EstherLerouzic
ed1aa0aa03 Update Excel userguide
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I9ef3d26c8363526d14fb00a17c31176925488abd
2020-04-23 14:45:39 +02:00
EstherLerouzic
3fc024f5ba include all 'no' Capital/non capital combination in test
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: If9ebdfad62d10b63856652d094ef459d68973f7c
2020-04-23 12:43:11 +00:00
EstherLerouzic
4f8177908f small refactor
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I9bc7e5bb171864ee385fb2e59631943f2a65d235
2020-04-23 12:42:59 +00:00
EstherLerouzic
ad64595d75 Adding a set of tests on spectrum_assignment functions
- test oms building:
  o there should not be any ROADMs or transceivers between the end points, other than the endpoints themselves
  o each element's oms id must correspond to the correct oms id
  o the element uid in the oms must match the id list

- test the alignment function
  this function makes it possible to have different grid end points in different parts
  of the network. The grid used in the network must cover all these grids, so
  some bit stuffing is needed on the oms that have different sizes.
  The test checks that
    o the min and max attributes are correctly updated
    o min and max n or freq values are consistent and consistent with bitmap
    o alignment is correct

- test that the assignment of n and m values is correct
  o check that assign_spectrum has returned an error code if the requested
    assignment is not possible and that the bitmap has not been set to 0
  o check that the bitmap sum works correctly when assignment is feasible
    and that the whole requested range of spectrum has been set to zero
    (see the sketch after this list), e.g.:
    [1 1 0 0 0 0 1 1 1 1 1 1] and [1 1 1 1 1 0 0 0 0 1 1] must be:
    [1 1 0 0 0 0 0 0 0 0 1 1]

- Check that the spectrum assignment of 13,7 is correct in Hz

    This example has been extracted from ITU-T G.694.1;
    the expected values in Hz for 13,7 are 193137500000000.0 and 193225000000000.0,
    see fig. I.3 of this document: https://www.itu.int/rec/T-REC-G.694.1-201202-I/fr

- test assignment limits

  o verify that inconsistent values raise an error: i.e., with defined fmin and fmax,
    n and m have limited values;
    combine valid and invalid data for n and m
  o verify that Bitmap created with a 0/1 list is consistent with fmin,
    fmax, grid and guard band

- Test with a path configuration

  o loop over assignments on a given path; assignment should be OK until n = 96,
    at which value the assignment is no longer feasible (it exceeds the grid) and the selection
    function should return None for all of center_n, startn and stopn
  o select an arbitrary request and try to assign 1 slot more than the whole
    spectrum, or exactly the whole spectrum, and check that the function correctly
    returns None or not-None values

- Test assignment with reversed path

  o add data and requests fixtures to reduce test time
  o test that if spectrum is assigned in one direction it is also
    assigned in the reversed direction (the bitmaps must be identical)
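
The bitmap arithmetic exercised above, in a nutshell (illustrative data,
not the exact vectors quoted in the list): assigning spectrum clears bits,
and combining two assignments is an element-wise AND.

    from numpy import array

    a = array([1, 1, 0, 0, 1, 1])
    b = array([1, 1, 1, 0, 0, 1])
    assert list(a & b) == [1, 1, 0, 0, 0, 1]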

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Co-authored-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
Change-Id: I96dd15452cc2e59086d4022ec4b026be845f4355
2020-04-23 14:36:09 +02:00
Jan Kundrát
80133e9521 Merge "Better explain the source of error in exception message" into develop 2020-04-23 12:22:05 +00:00
EstherLerouzic
de58d7d7c2 Enable ILA and FUSED type in routing constraints with xls input
- builds correspondence dicts between input names from Excel
  and names created with convert.py and autodesign in network.py
- correct the corresp_name dicts according to the effective
  network autodesign. This supports the case of fiber splitting
  and of fused elements
- include the case of parallel links with only one hop
- interpret the node list constraint given by the user with the dict
- filter the constraints that are not applicable
- add tests for constraints
- correct equipment sheet of mesh_example_topologyv2.xls: morlaix and
  loudeac should not appear in node A column since they are fused

ILA and FUSED constraints must be filled with the next node
information in order to avoid confusion about the direction,
for example:
     eg    a----b-----c
           |    |     |
           i    j     k
           |    |     |
           e----f-----g

a constraint 'j' given for a service from i to k leads to 2 possible directions:
i-a-b-j-f-g-k
i-e-f-j-b-c-k
the user must indicate the chosen direction. This ambiguity does not
exist with network input in json format (names are unique).

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Ia7cf2026e569c8b566459791fc89546b91fb117c
2020-04-23 11:15:37 +02:00
Jan Kundrát
491a05c5a7 Upgrade pytest, and list it as a test-only dependency
Change-Id: Id1cbfb88d3034fc01f302af53bbd689afe54ffa5
2020-04-22 14:53:48 +02:00
Jan Kundrát
1d791aa295 CI: Activate coverage diffing
Depends-on: https://review.gerrithub.io/c/Telecominfraproject/oopt-zuul-jobs/+/489964
Change-Id: Icce2c2b11f4253e2400dcfe0055373f9d6bc7d14
2020-04-22 13:18:46 +02:00
Jan Kundrát
58921bc346 tests: simplify tox setup
I was not happy with duplicating the "test command" section among
several tox test environments. I was not successful when using just
`.coveragerc` or via `addopts` in `[tool.pytest]`, but these
conditionals appear to work.

Also replace `pip install -e .` or `python setup.py develop` with tox'
native `usedevelop`.

Change-Id: I4986a3d14f38b7f6ed9b6a04f773eb222a53a827
2020-04-22 13:18:46 +02:00
Jan Kundrát
ab6a91692b CI: Run both non-coverage and coverage build/test jobs
Change-Id: Ia58112b745a0b34d2dcef782ad729f165629eb22
2020-04-22 13:18:46 +02:00
Jan Kundrát
3b45968799 tests: Check if example code provides exact same output
Change-Id: I5938f85337e4254092683dadc806a0a419cb2a04
2020-04-22 13:18:46 +02:00
Jan Kundrát
c7d69b9a99 tests: Run examples via pytest
This will make it simpler to update coverage info. The pytest-cov plugin
that we're already using apparently supports this out of the box, nice.

Change-Id: Ieafc0da99a8c325f5f2286ed11e66069e244e43b
2020-04-22 13:18:46 +02:00
Jan Kundrát
1eeed78430 Show more details upon a test failure
With -vv, we can at least have access to the full output -- if not in
its prettiest form.

Change-Id: I950bb4c2f59a9f582e53e48c78458bb47ab7d832
2020-04-22 13:18:06 +02:00
Jan Kundrát
1cdafbae0f docs: explain what the roll_off parameter means
Co-authored-by: Alessio Ferrari <alessio.ferrari@polito.it>

Change-Id: I4930dc5fdd9d4f60cef753f2d2e9674dce8aeb52
2020-04-21 21:27:14 +02:00
EstherLerouzic
47b4f87bc5 Better explain the source of error in exception message
If the user forgets to fill in the path_bandwidth, the error message was
not explicit enough to help them correct it.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I776e95b6f859c3b643786b6e802f3c0bcbc1de76
2020-04-06 11:15:36 +02:00
Jan Kundrát
e24b766cdc Merge "Add an explanation for delta_p in excel user guide" into develop 2020-04-02 12:02:05 +00:00
Jan Kundrát
51fb6bd68e Merge "Correct the unit of gamma in Readme file" into develop 2020-03-30 13:37:03 +00:00
EstherLerouzic
7e405a0514 Correct the unit of gamma in Readme file
The units can be found in several papers from the documentation,
for example:
A. Carena, G. Bosco, V. Curri, P. Poggiolini, M. Tapia Taiba, and
F. Forghieri. Statistical characterization of PM-QPSK signals after
propagation in uncompensated fiber links. In European Conference on
Optical Communications, 2010, 1–3. IEEE, 2010-09.
URL: http://ieeexplore.ieee.org/document/5621509/,
doi:10.1109/ECOC.2010.5621509.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: If70b57a28ee108c4b8dbe341075e12169efbe14c
2020-03-30 15:29:29 +02:00
EstherLerouzic
194a13e607 Add an explanation for delta_p in excel user guide
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I0e3380abd494dd99a30bee00b55ec427d48c45ef
2020-03-30 15:22:43 +02:00
Jan Kundrát
856f07e707 Merge "Zuul CI: remove manual workaround for Python3 and TOX" into develop 2020-03-30 13:19:01 +00:00
Jan Kundrát
4582aed895 Merge "docs: Fix all sphinx warnings" into develop 2020-03-30 13:17:44 +00:00
Jan Kundrát
734b6dfef4 Zuul CI: remove manual workaround for Python3 and TOX
Now that https://review.opendev.org/712804 has been merged, we can just
use upstream jobs without that extra workaround.

Change-Id: Ia113caaea41bdd440dcb615c02ad95eba6da7543
2020-03-27 16:06:06 +01:00
Jan Kundrát
da6e4f33a4 Merge "Fix dependency installation with PBR" into develop 2020-03-27 15:04:10 +00:00
Jan Kundrát
5a1e3f30b3 docs: Fix all sphinx warnings
...and also enforce a warning-free build within the CI.

Change-Id: Ia406a0a1ca2e89ceaa0288ae82128fa9427fe066
2020-03-27 15:07:19 +01:00
Jan Kundrát
094af16792 CI: build documentation, too
I've bumped Sphinx because this might be needed for proper support of
pbr-provided version numbers. In the end I also had to re-run `pip
install -e .` locally, quite likely due to a different format of version
numbers now that pbr is being used, but let's make sure that we're using
a version of Sphinx that is now known to work.

Change-Id: I451b4b17bb8097eb7f2f0f0956cc1c956a531828
2020-03-27 12:46:53 +00:00
Jan Kundrát
9474ff17de Fix dependency installation with PBR
Ouch, this one hurts. It turns out that PBR-based setup does *not*
install all dependencies when running `python setup.py install`. On the
other hand, running any of:

- `pip install .`
- `pip install --editable .`
- `python setup.py develop`

...makes everything work. So, let's change the instructions and all the
build scripts (including the Docker file) to make sure this thing still
works. Sorry for the noise.

This is a significant change: it means that people will have "in-place
installations" of GNPy. Changes to their git checkouts, if used, will
apply, etc. I think this is actually a good change.

fixes #287

TL;DR: package installation with Python is still a mess.

Change-Id: I422b889b599e0b7cae36f160d1548cef7fb50a4e
2020-03-27 13:41:55 +01:00
Jan Kundrát
c675f5bd38 docs: do not reference a non-existing directory
Change-Id: I197956d3b55f0ebcf20cee54d5f2887bbd821fa3
2020-03-25 19:34:42 +01:00
Jan Kundrát
2556658e68 CI: build and test this under Python 3.7 and Vexxhost's Zuul
Change-Id: Ifa849ab8182596dc0285c18365ead806ea9d0bc9
Co-authored-by: Mohammed Naser <mnaser@vexxhost.com>
2020-03-25 19:34:42 +01:00
Jan Kundrát
4d5d10935a Convert to pbr setup
It turns out that our current setup does not really support the `sdist`
Python packaging step. I'm trying to increase automation, both for
making releases for upload to pypi.org, and also for CI via Zuul. As it
turns out, Zuul comes with a set of predefined jobs which -- by default
-- use `tox`, and tox was having trouble dealing with our `setup.py`
because it also assumes that the `sdist` step works. PBR is also supposed
to:

- fix version number duplication in `setup.py`,
- fix version number duplication in Sphinx docs,
- prevent the need to write a `MANIFEST.in` manually.

TL;DR: instead of having to deal with a ton of other creative
boilerplate, use tools for lazy people like me, and PBR ("Python Build
Reasonableness") appears to tick all the boxes here.

Change-Id: I27c36c4f6b0e76857d16f7819ec237e9b811476a
2020-03-25 19:34:42 +01:00
Jan Kundrát
eb87e36781 Configuration for git-review
We're still primarily a GitHub project, so it is necessary to add a
pointer for git-review to point to GerritHub (which is our Gerrit
hosting site).

See-also: https://docs.openstack.org/infra/git-review/
Change-Id: I6fe8da2358ae50340fbe384b038ab097ab229b59
2020-03-25 12:50:23 +01:00
Jan Kundrát
14d8d793f3 Fix test failure due to mishandled merge
In commit 80eced8, the structure of parameters to `elements.Fiber` was
changed. Options such as fiber length are now passed in via
`self.parameters.*` instead of `self.*`. Commit 639b379 which fixed a
test failure precedes that change, and when we merge both as I did in
commit bc4b664, the test no longer works. My bad. On the other hand,
this will be caught by trunk gating which is something that Zuul can do,
and therefore something that we'll have in our upcoming CI, yay!

Fixes: bc4b664
Change-Id: Ifcd8f0bf01e9d91dbef3da1aa7f56f89132d6f48
2020-03-24 19:50:24 +01:00
Jan Kundrát
bc4b6642a0 Merge pull request #340 from Orange-OpenSource/correction_test_amplifier
Correction test amplifier

Fixes #324, #321
2020-03-24 16:59:30 +01:00
Jan Kundrát
fe2b39ee3b Merge pull request #342 from jktjkt/pr-337-339
Cleaned-up PRs #337 and #339 which are related to parsing of `STRICT` path requirements in service requests.
2020-03-24 13:51:59 +01:00
EstherLerouzic
d56c91ca7f small linter fixes
(Jan: cherry-picked relevant parts of c6be21f36f)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Ib84f452a15a20a88ea186ceef64712eb967d46a7
2020-03-24 12:07:33 +01:00
EstherLerouzic
25819fadf5 refactor test and add test on constraints
(Jan: cherry-picked relevant changes from commit
44e8936f04d6eb334cea2e5643d8ddceb6a0a5a8)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Signed-off-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
Change-Id: Ibeeaa42b35e23075618c8543a857f47fac48df27
2020-03-24 12:05:56 +01:00
EstherLerouzic
0cacb8851d Fix node_list and "STRICT" checking
The check on the node_list constraint has to be done on all elements except the last one,
and the last one is always a "STRICT" value. This means that if no constraints are given,
this list is equal to ["STRICT"]. The previous code was using len("STRICT"), i.e. 6, instead of
len(["STRICT"]), i.e. 1.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-03-24 12:00:20 +01:00
Jan Kundrát
14f98836ed Merge pull request #341 from ojnas/import-xlsx
Enable import from .xlsx files
2020-03-23 11:17:34 +01:00
Jan Kundrát
82ce3384ba tests: Use XLSX for the CORONET topology
Now that XLSX is officially supported, it should be tested as well.

Change-Id: I9286d3cb56950d554582d2eaf8153da5ffdd771a
2020-03-21 13:19:24 +01:00
Jan Kundrát
1f1877f7a9 tests: do not hardcode file suffix lengths
This will become useful for XLS -> XLSX conversion.

Change-Id: I025e4c24d00526d3bb48c23dcbdc82a65be9a477
2020-03-21 13:19:24 +01:00
Jonas Mårtensson
db26ce07db Enable import from .xlsx files
Fixes issue #336
2020-03-20 18:17:02 +01:00
Jan Kundrát
e08ae9c959 Merge PR #329 into develop
This is a continuation from #319. The singleton-ish interface has not
been changed yet.

Change-Id: I93c8f1145561184f6e91f7e8f7debd3b48936205
2020-03-19 17:15:14 +01:00
EstherLerouzic
2de1b5567a updating test_compare_nf_models to correctly compare variable gain with polynomial
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-02-24 09:53:08 +01:00
EstherLerouzic
639b379a5b Updating ase test to handle the variable gain setup
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-02-21 16:23:11 +01:00
Mohammed Naser
0465397b1d Add empty Zuul configuration
In order for Zuul to start self-testing, it must actually have
a project entry listed.  This is the initial commit.

Change-Id: I017cb036f3191e46446d82b96a6acd35f2adcd0e
2020-01-28 20:20:41 +01:00
Jan Kundrát
d3ec39d506 Bump version to 2.1
Change-Id: I9f27fc87c5ca43e473fe212d2ee3dad7b12d4061
2020-01-15 00:09:48 +01:00
Jan Kundrát
bfe68a5948 Merge branch 'develop'
Change-Id: If7860b243cb504613a7fffafad2d601510000af7
2020-01-15 00:09:18 +01:00
Esther Le Rouzic
2ea3363613 Merge pull request #331 from Orange-OpenSource/pr_test
update version setup.py
2020-01-14 13:02:32 +00:00
DELFOUR Emmanuelle TGI/DATA-IA
89cce6e6a3 update version setup.py 2020-01-14 10:22:21 +01:00
Jan Kundrát
0f10ac706c Merge pull request #267 from Orange-OpenSource/capacity_planning-part2
This brings in the concept of OMS (so far created implicitly based on the input topology), spectrum assignment, etc.
2020-01-13 09:30:45 +00:00
EstherLerouzic
66d26f0ffa add the missing else to handle non first_fit policy
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-01-10 18:46:24 +00:00
EstherLerouzic
3c96914482 replace todo with TODO
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-01-10 18:41:40 +00:00
EstherLerouzic
61b1e73362 remove unused testing piece of code
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-01-10 18:40:18 +00:00
Jan Kundrát
03435079cc refactoring: simplify code flow a bit
- no need to explicitly log exceptions that are about to be raised
- kill some extra commented-out prints

Change-Id: I73bae5a2456644c4d4ff45bd984d44c27bc22ec4
2020-01-10 18:17:44 +00:00
Jan Kundrát
6661907c1d fix typos
Change-Id: I0a4d2c14c5e873dd521736525bb9b10c9b70975b
2020-01-10 18:17:17 +00:00
Jan Kundrát
fe811f725c tests: use native pytest features for exception handling
Using `with pytest.raises` is better than open coding the equivalent
feature. Similarly, when a block is not expected to raise an exception,
let's just let it run outside of a `try` block and rely on the test
framework to report a possible failure when hitting an unhandled
exception.
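
A minimal sketch of the preferred pattern (illustrative tests only, not taken
from the suite):

  import pytest

  def test_bad_input_raises():
      # pytest fails the test for us if the exception is not raised
      with pytest.raises(ZeroDivisionError):
          1 / 0

  def test_good_input_does_not_raise():
      # no try/except needed: an unhandled exception fails the test anyway
      assert 1 / 2 == 0.5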

Change-Id: Icb1bb83e649733b56fcdc9168cabf88c9cf8d478
2020-01-10 18:16:53 +00:00
AndreaDAmico
80eced85ec Refactoring with some incompatible changes
Please be advised that there were incompatible changes in the Raman
options, including a `s/phase_shift_tollerance/phase_shift_tolerance/`.

Signed-off-by: AndreaDAmico <andrea.damico@polito.it>
Co-authored-by: EstherLerouzic <esther.lerouzic@orange.com>
Co-authored-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
2019-12-17 11:51:09 +01:00
AndreaDAmico
2960d307fa Unrelated changes
linter changes

Signed-off-by: AndreaDAmico <andrea.damico@polito.it>
Co-authored-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-12-17 11:13:00 +01:00
AndreaDAmico
c41cddfff5 Introduce new parameter structure for all Fibers
The Polito team wanted to have a single object with all parameters
required for each `Fiber` instance -- apparently for better code
readability.

Signed-off-by: AndreaDAmico <andrea.damico@polito.it>
Co-authored-by: Alessio Ferrari <alessio.ferrari@polito.it>
Co-authored-by: EstherLerouzic <esther.lerouzic@orange.com>
Signed-off-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
2019-12-17 11:12:52 +01:00
Jan Kundrát
8598e6591f Merge pull request #304 from Orange-OpenSource/user_error_catching_improvment
Give more useful comment for user to correct topology
2019-12-12 16:11:54 +00:00
EstherLerouzic
5e2259062c Give more useful comment for user to correct topology
when a node name used in a link does not correspond to the listed
node names, a "Bad link" message was thrown without saying which link
caused the problem.
This change adds information about the offending link and catches the
user error more cleanly.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-12-12 15:55:33 +00:00
Alessio Ferrari
d483802a86 End-to-end test for Raman amplification 2019-12-12 12:52:59 +00:00
Jan Kundrát
9a0eece69c Merge pull request #322 from jktjkt/master
docs: Link to the updated meeting event
2019-12-09 20:46:06 +00:00
Jan Kundrát
1657bfd05f build: workaround Sphinx doc build failure
The upstream bug report at sphinx-doc/sphinx#6887 suggests that this is due
to pip picking up the 0.16b0.dev0 pre-release...

Bug: https://github.com/sphinx-doc/sphinx/issues/6887
2019-12-09 16:05:56 +01:00
Jan Kundrát
49bf558916 docs: Link to the updated meeting event 2019-12-09 14:31:59 +01:00
Jan Kundrát
99f44a597b Merge remote-tracking branch 'origin/develop' 2019-11-13 19:59:35 +01:00
Jan Kundrát
a21f3fe6ee Merge pull request #312 from jktjkt/fixes
Small refactoring for itufs/utifl
2019-10-16 05:38:37 +00:00
Jan Kundrát
0ccbb2960c Merge pull request #277 from Orange-OpenSource/capacity_planning-part1
Capacity planning part1
2019-10-16 05:36:56 +00:00
EstherLerouzic
c577a75725 second set of modifications due to the codacy report
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
8827e0cf6f Codacy report: minor changes (trailing spaces, ...)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
b0012fe399 SA: process also the case when pth is empty
TODO: a path containing only transceivers and no ROADMs leads to no OMS;
this has not been properly taken into account. (A single link without a
ROADM has no spectrum-assignment complexity and is out of scope.)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
31e634615b Complete test on response comparison
If the request is bidir and 'z-a-path-metric' is missing, an error is raised;
if it is present, no error should be raised.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
8300a55e39 Update test file with correct SA (due to bidir correction)
reversed path must be taken into account for spectrum assignment

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
5b939bc57a Improve compare_response function
remove some prints, indicate whether the actual or the expected key
is printed, and make it explicit that the test is also performed on
values.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
2f1ab9cc50 Create a SpectrumError(Exception)
Create an exception for spectrum-related errors, e.g. when the spectrum
request or its values are not correct, and use it instead of exit().
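
A minimal sketch of the pattern (names and the check are illustrative, not the
exact project code):

  class SpectrumError(Exception):
      """Raised when a spectrum request or its values are invalid."""

  def check_nb_of_slots(m):
      # raise instead of exit() so that callers can catch and report the problem
      if m < 0:
          raise SpectrumError(f'invalid number of slots: {m}')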

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
42ba3eb98d Add a test to go through the new added code in spectrum_assignment
Verify that no error is raised if M=0 and the blocking attribute is present

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
9eb87fc8e1 raise error instead of exit in assignment
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
8fab9bb945 Raise exceptions instead of exit()
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
1ead232a78 Do not block computation if M=0
When a request is blocked, its M value is set to 0. This should not raise
an error in pth_assign_spectrum. This test verifies that the function
does not raise an error.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
b15c8c60ab Add a DisjunctionError exception and use it
in case there is no possible disjoint path with the added constraints
(most of the time due to an inconsistent user request)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
66bdeb0e4d Catch Service error exception in the main program
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
1a2e090104 Move all main program into a main function in path_requests_run.py
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
a8e280e29b Remove exit and use ServiceError exception instead
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
edb54b02ac One more round of formatting improvements (codacy report)
removing unused import
change short variable names to conform to [a-z_][a-z0-9_]{2,30}$
change main variable names to conform to [A-Z_][A-Z0-9_]{2,30}$
add or remove spaces
add docstrings
correct comments and indents of cut lines

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:02 +01:00
EstherLerouzic
83d3f32fe0 Use a more precise exception type (codacy report)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
085a379592 improving formatting (codacy report) of path_requests_run, request and service_sheet
remove unused imports
add docstrings
conform to '[A-Z_][A-Z0-9_]{2,30}$' pattern in main
remove trailing spaces
add/remove extra spaces

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
37bd5d0404 Code coverage improvement: test service creation w/o sync vector
add a single-service Excel file plus changes to compare.py to support
the case with no synchronization vector

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
f788b81d21 Code coverage improvement: test json generation with Z to A direction
add a bidir request to test service error handling
TODO: write a more detailed test on the bidir case

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
2ff1ce6b34 Code coverage improvement: test ServiceError handling
For this purpose we create a wrong request with M=0 and verify
that ServiceError is correctly raised when calling the Result_element class

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
41a1e40d14 Update test data files
- add the bidir information on json expected services,
- add the label object in json responses
- add the spectrum on csv responses
- add the no-path container when relevant

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
921e8d2d3c Call build_oms_list in test_disjunction
Call build_oms_list in the test to correctly build all attributes
of network elements

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
c009d28f7d Update test_disjunction and test_automaticmodefeature
Add bidir argument on load_requests calls

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
898eada097 Update test_parser
function compute_path_with_disjunction needs an additional reversed path
and results must contain wavelength assignment

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
bdfc55e801 Add a ServiceError(Exception) for malformed user requests
For example, the requested bandwidth should be > 0

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
57f264bedb Change name of br and pw variables to brate and pwr
To conform to [a-z_][a-z0-9_]{2,30}$

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
fbe4fa3cf0 add docstring to functions
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
b2ef345f35 Cut long lines over 100 char
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
471ea7dfba Correct indentation due to linter report
pylint3 recommends that continuation lines have the same indent as the
opening [ or (

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
1b52f638ff Once again: correct indentation due to linter report
pylint3 recommends that continuation lines have the same indent as the
opening [ or (

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
84ab38a75f Once again: removing extra spaces, trailing spaces and adding missing spaces
for codacy report:
     removing extra spaces before , and :
     adding spaces after , :
     adding spaces around < > =

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
916e5377f8 change variables d, el to conform to '[a-z_][a-z0-9_]{2,30}$' pattern
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
534bfd881e correction of the assignment: SA must be bidir
the reversed path must be computed even if bidir is not requested,
because in WDM systems services are all bidir.
The bidir option only avoids the lengthy propagation of the reversed
path when it is not needed

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
7c4015324d Moving reversed path computation and propagation to compute_path_with_disjunction function
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8499ee52f4 Removing extra spaces, trailing spaces and adding missing spaces
for codacy report:
 removing extra spaces before , and :
 adding spaces after , :
 adding spaces around < > =

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
cc1123863c Cut long lines to 100char max due to linter report
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
ca382806f6 Reordering imports due to codacy report
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
3559fc61c2 Introduce reversed path on CSV output
- use a new jsontopath_metric function to collect metrics
  out of a json response.
- path metrics must be shown also for the reversed path if this
  is requested. This function avoids repeating code here
- add reversed metrics in the last columns of the CSV

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
33581cdcc9 Add bidir as a parameter in path_requests_run.py
if the --bidir option is used on the main program, the field is set
  to true for all demands in case demands are expressed in an excel
  sheet. The --bidir option does not change the bidir field if the service
  file is in json format.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
991eb02964 Add the bidir parameter in transmission_main_example.py
Update the program because the same Path_request class is used,
which now includes the bidir field

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
286e321a2d Add "bidirectional" field in the json service file example
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
56f158113d Implementation of bidirectional field per request
- Instead of applying the bidir option independently from the service demands (json or xls),
  the "bidirectional" attribute is introduced per request in the json.
  This enables a bidirectional option per request.
  If the --bidir option is used on the main program, the field is set
  to true for all demands in case demands are expressed in an excel
  sheet. The --bidir option does not change the bidir field if the service
  file is in json format.
  The default value of the "bidirectional" attribute is False.
- As a result the reversed path is propagated only if the bidirectional
  field of the request is True. (Remember that the reversed path must
  be computed regardless of the option because it is needed to compute
  spectral occupation in both directions.)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
024f6ff963 Simplify stdout output
all BLOCKING_NOMODE requests have the same type of response
so the test is simplified to account for this

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8118a0f4f4 Second minor cosmetic changes
- add some help input for the main program arguments
- move and correct comments
- add empty lines on stdout to have a nice printing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
eb89d8fd86 Minor cosmetic changes
correct typos in comments and clarify that results show the
mean SNR value of all channels on stdout

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
a938c1738b add reversed path information on stdout
the information on the reversed path SNR is shown in parentheses

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
4f88882513 Use reversed path as a constraint for spectrum assignment
in path assignment function, path elements and reversed path
elements are concatenated to compute the overall spectrum
availability on all elements

in main program, assignment is performed after computing reversed paths

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
4e8d8b7ddd adding reversed path in the main program
add a bidir argument to make bidir propagation optional.
Reversed path computation is not optional because it is needed
for spectrum assignment.
For all requests, if a path could be computed, a reversed path is
computed as well, and propagation is performed on it if the bidir option is on.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
afb7d75749 Add the reversed path as an attribute of Result_element
the object also contains the reversed path information if
the path is not None

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
488d0e1fe8 Change find_reversed_path function using the reversed_oms
reversed_oms was introduced in order to identify which oms corresponds
to the reversed direction of a given oms.
With this commit we make use of this functionality and avoid the cumbersome
approach that recovered the reversed path by computing the shortest path in the
reversed direction for each roadm. That approach could not support multiple links
in parallel between two roadms, and was adding computation time.

The function first lists the oms of the path in the reversed order and then
appends all their elements to the path. The first and last elements are transponders
and are appended separately.

Unidirectional topologies are not supported: if there is no reversed path, this raises an error

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
708442e4cd Introduce a reverse_oms function to identify bidir oms
reversed_oms is introduced in order to identify which oms corresponds
to the reversed direction of a given oms. Indeed, for spectrum assignment
it is mandatory to mark spectrum resource occupation in both directions
(requests are assumed bidirectional in WDM).

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
a2d905dfb1 Add N and M info in the stdout
the couple (N,M) is displayed only when a request is not blocked

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
d564fe3e2a Add N and M info in the CSV
The couple (N,M) is added in the last column only for non-blocked requests;
an empty string is used otherwise

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
ea21cce1c0 Add N and M in the json response
The label object is added in the response. It contains center index N value
and number of slots M, required by the request according to the computation.

The label object directly follows the hop attribute as detailed in
draft-ietf-teas-yang-path-computation.

If the path is not blocked this changes the index of the last hop information
(-3 instead of -2) and the index of the transponder for the first hop
(2 instead of 1)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
aa9b4aefbe Add the N and M information in request when spectrum is assigned
N is the center frequency index (on the G.694.1 grid) and M the number of slots
required by the request according to the computation.
For convenience, we use N = 0 and M = 0 when requests are blocked. Note that N = 0 is
a valid index when M is not 0.

If the number of slots required by a request is not feasible, the request is marked
as blocked with 'NO_SPECTRUM' as the blocking reason
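
For orientation, the flexgrid convention that N and M refer to (taken here from
ITU-T G.694.1 and shown as a sketch, not as the project's actual helpers):

  # center frequency and slot width for a flexgrid label (N, M)
  def center_frequency_hz(n):
      return 193.1e12 + n * 6.25e9    # 6.25 GHz granularity around 193.1 THz

  def slot_width_hz(m):
      return m * 12.5e9               # slot width is m x 12.5 GHz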

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8655030e59 Add blocking reason on stdout
the previous way only checked path existence to know whether the request was
blocked or not.
Now the printing checks if the blocking_reason attribute exists, and if so
adapts the printing accordingly. The reason for blocking is added to the
output.

if no path could be computed, snr, osnr and other path-dependent metrics
are replaced by empty strings;
otherwise, the metrics corresponding to the computed path are shown

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8107ddeb79 Change the processing of blocking case in main program
In compute_path_with_disjunction, in case the user mode is not feasible,
return the blocking reason instead of an empty path.
If the user does not give the mode and the automatic selection does not
find any feasible mode, then instead of checking if a mode exists, the function
now checks for the presence of a blocking reason.

if the blocking reason is among the BLOCKING_NOPATH reasons, then an empty
path is returned;
if the blocking reason is among BLOCKING_NOMODE, then a path could be computed
and the mode information corresponds to the last explored mode.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
76c8e55f06 CSV generation is adapted to include blocking info
creation of a function to avoid code duplication: json_param creates
the relevant parameters to show in the csv based on json input

replace try/except by a test on keys:
the previous way tried to get pth_el['no-path'], and if the path was
not blocked this raised a KeyError. Now there is a simple check whether
the key is present.

Besides, as the no-path has been changed to a 'no-path' container containing
a 'no-path' attribute with the blocking reason, the test is made on the
attribute, i.e. on pth_el['no-path']['no-path'].
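
A sketch of the key-presence check described above (the structure of pth_el is
taken from this commit message; the surrounding code is illustrative):

  # prefer an explicit membership test over catching KeyError
  if 'no-path' in pth_el:
      blocking_reason = pth_el['no-path']['no-path']
  else:
      blocking_reason = None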

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
a7b1ab47d8 adapt response from the compute_constrained_path to return info in case of blocking
if the path is empty NO_PATH reason or NO_PATH_WITH_CONSTRAINT reason
is returned in the blocking_reason attribute

if no mode is feasible, the last explored mode is returned with the path (and
implicitly the last computed SNR). The baud_rate is derived from this last
mode

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
879f587ab9 Adapt json response to include information in case of blocking
if a path could be computed, details of the path and of the propagation are given.
  - the 'no-path' attribute is changed to a 'no-path' container that contains:
      o the 'response-id' attribute
      o a 'no-path' attribute with the blocking reason
      o if a path could be computed, the 'path-properties' of the path that
        was computed, with the metrics

Note that this proposal to add information for blocking in the json output (instead
of a bare NO PATH) corresponds to the way PCEP works in general, but is not yet
integrated in the draft-ietf-teas-yang-path-computation model. Returning the whole path
in case of blocking in addition to the blocking reason is a novelty from GNPy and was a
request from the users.
TODO: use the correct ietf model when ready

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8af2d80219 Add reasons for blocking in the response
Before continuing on spectrum assignment we need to clearly identify blocking reasons:
the implicit previous way (a missing path) is not satisfactory because in case of
spectrum blocking a path was possible. So we introduce a 'blocking_reason' attribute
on requests, which only exists if the path is blocked during the process.
The 'blocking_reason' attribute reflects the blocking reason.
The commit defines blocking types and groups them depending on the existence of a path or
its feasibility:
    'NO_PATH': no path was computed,
    'NO_PATH_WITH_CONSTRAINT': no path was computed with this constraint
    'NO_FEASIBLE_BAUDRATE_WITH_SPACING': no path was computed due to the spacing constraint
    'NO_COMPUTED_SNR': the computed path could not give any SNR result
    'NO_FEASIBLE_MODE': the user let the program choose a mode and a path was computed but
                        no mode was feasible for the set of constraints
    'MODE_NOT_FEASIBLE': the user imposed a mode, a path was computed but this mode
                         is not feasible
    'NO_SPECTRUM': a path and a mode were selected but there is not enough spectrum available on
                   this path
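
A hedged sketch of how such a grouping can be used downstream (the constant names
come from the commit messages in this series; the exact membership of each group
lives in the project code, this is illustrative only):

  BLOCKING_NOPATH = ('NO_PATH', 'NO_PATH_WITH_CONSTRAINT',
                     'NO_FEASIBLE_BAUDRATE_WITH_SPACING', 'NO_COMPUTED_SNR')
  BLOCKING_NOMODE = ('NO_FEASIBLE_MODE', 'MODE_NOT_FEASIBLE')

  def describe(blocking_reason):
      if blocking_reason in BLOCKING_NOPATH:
          return 'no usable path, nothing to report beyond the reason'
      if blocking_reason in BLOCKING_NOMODE:
          return 'a path exists; mode info is the last explored mode'
      return 'path and mode exist but spectrum is missing, e.g. NO_SPECTRUM'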

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
315eea1f55 Calling assignment in main program path_requests_run.py
the function is called to assign spectrum to each request;
the result shows an additional column for the blocking reason

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
b61e541e15 pth_assign_spectrum function assigns wl for all demands
this is a first function that assigns spectrum following the order
of requests according to the selected mode and nb of channels computed
based on requested path_bandwidth

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
81f88e78c7 Creation of the OMS structure to record spectrum assignment
oms_list contains the list of OMS, and each OMS contains the ordered list
of uids of the crossed elements.
Each element is updated with the oms_id to which it belongs.
Each OMS contains a bitmap of frequency slots according to the
frequency min/max defined in eqpt_config.json (SI); in case OMS
are defined elsewhere, grids are aligned to ease computation.
The build_oms_list function builds the OMS list and sets the oms attributes
on all network elements.
The commit also contains basic functions to handle the spectrum bitmap and indexes
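
A toy sketch of the per-OMS frequency-slot bitmap idea (not the project's actual
data structure or API):

  # one entry per frequency slot on this OMS: 1 = free, 0 = occupied
  n_slots = 768
  bitmap = [1] * n_slots

  def occupy(bitmap, first, last):
      """Mark slots first..last (inclusive) as used on this OMS."""
      for i in range(first, last + 1):
          bitmap[i] = 0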

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
87cc3dac00 Corrections with respect to second review
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:09:39 +01:00
EstherLerouzic
89e28cc7be Correct bug in parser: csv header comparison
the previous comparison was done on the result of .sort(), i.e. None.
The list.sort() method modifies the list in-place and returns None.
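
A quick illustration of the pitfall (toy values, not from the test suite):

  expected = ['a', 'b', 'c']
  header = ['b', 'c', 'a']
  header.sort()                                # sorts in place, returns None
  assert header == expected                    # correct: compare the list itself
  assert sorted(['b', 'c', 'a']) == expected   # or use sorted() for a copy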

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
2ba29a78c5 Bug: constraint not correctly interpreted
if the name of the constraint input by the user is not part of the network names
(typically in the case of excel input), then the program tries to find a
name that is close to the user name and that is in the network list of
names. This list must not include transponder names (because transponder
end points are already listed as constraints and transponders in the
middle of a path are not supported yet).
This will be improved in the PR "Ila names in constraints" #278
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
f990a6c1be Correct some remaining strict loose into STRICT LOOSE
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
424e5a4786 change variable n to nel to conform to '[a-z_][a-z0-9_]{2,30}$' pattern
from codacy report

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
6a7a04ebb1 syntax correction
reordering import calls according to pylint3 + removing an extra line

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
0366fc2956 Adding a no-path case for test coverage
add a no-path case (request 6) in requests and expected responses.
A response is also generated if the path is not feasible: check
that it is correctly handled in csv and json responses

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
48b7d71f02 add a test on json response generation
create a json response based on the test file and compare it to the expected
response.
The comparison first checks that there are no missing or extra keys
and then compares keys

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
715baf2a1c correct compare response header test to support any order
the previous test assumed the same order for header fields.
Order the headers in the same way in order to support different
orderings

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
e55cea776e improve code quality
remove unused imports, use correct indents, remove extra spaces,
add function descriptions

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
b388d143fd change assert to raise AssertionError
use
  if not condition:
      raise AssertionError()
instead of
  assert condition
according to codacy recommendation

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
c592c572d8 adding a test for the csv creation
using the pandas package to have an easy ordering of response columns:
- test that the generated header is as expected
- read the response. In order to support different orders wrt
  response answer and field answer, the test function first orders lines
  according to the response index and then to columns (fields of the answer).
check that the answers are as expected
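
A minimal sketch of that ordering step with pandas (file and column names are
hypothetical):

  import pandas as pd

  # order rows by the response index and columns alphabetically before comparing
  actual = pd.read_csv('actual_response.csv')
  actual = actual.sort_values('response-id').reset_index(drop=True)
  actual = actual.reindex(sorted(actual.columns), axis=1)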

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
dfa0a26a28 changes to improve quality
minor name refactor
indent corrections
minor fixes for spacing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
609cd94798 Remove @ in path-metric names and extend type to decimal64
The @ character is not correctly read by the OpenDaylight yang tools used
for the transportPCE project.
https://docs.opendaylight.org/en/stable-nitrogen/developer-guide/yang-tools.html#working-with-yang-model.
Changed the names of path metrics from osnr@ to osnr-

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
022f743db1 Change the exception handling if sync vector is absent
the previous try section encompassed errors that should not be silently
ignored. The correction pointed out a defect in a test file.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
1957beb1b6 Remove the filtering on transponders for correcting explicit route
any type is supported, including transponders, in order to accept well-formed
route lists from the user (if the user enters 'trx Lannion_CAS' this should be
accepted even if it repeats the source name)

A later PR better handles route list names

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
9ca72d6105 Correct csv creation in case of no path
previously the requested bandwidth was reported; now it is filled with an
empty string, same as the other fields.

This will be updated in a later PR when different kinds of blocking
are supported

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
e8e126a6ce update existing test files and examples according to ietf yang model
- 'loose' and 'strict' changed to 'LOOSE' and 'STRICT'
- n, m changed to N, M
- unused objects removed
- ...
Also correct templates for service and response

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
7849782173 Change content of source and destination to transponder end-points
previously contained the user-given source name from the excel file,
which could differ from the effective transponder source/destination.

Now both source and src-tp-id contain the same info (gnpy does
not distinguish between nodes and ports)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
149a0da8c9 update csv creation according to all model changes
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
1e7c70a59b remove the restriction on fiber for the constraint
the restriction was placed on fiber elements because, when using excel input,
autodesign creates fiber names with a syntax that is not explicit to the user, so
entering a node name as a constraint may end up with a wrong interpretation, e.g.
aa - bb -> creates names such as 'fiber aa to bb' and 'fiber bb to aa'.
If the user constraint is bb, there is an ambiguity on the fiber direction.
However this is not a problem for users of the json format,
so I preferred to remove this restriction now.
A later PR solves this naming ambiguity

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
c9d8282e7f enable power and nch to be optional in requests definition
since empty attributes have been removed from the requests by the
previous changes, some additional tests are needed to continue supporting
optional power and nch (previous objects were created with null
or default values).
The user may enter requests without specifying these fields; in this
case the 'output-power' or 'max-nb-of-channel' objects do not exist.
The try/except form handles the corresponding exception, and
default values are kept otherwise.
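
A sketch of that fallback pattern (field and variable names are illustrative,
not the exact ones in the parser):

  # optional field: fall back to the default when it is absent
  try:
      power = request_dict['output-power']
  except KeyError:
      power = default_power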

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
e7084a2c29 correction to support empty vectors of constraints
change the naming according to the model update;
previous changes allow an empty list of nodes, and
this new test supports that case

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
d79d2e0724 change the way transponder and mode are returned in the response
the previous way used a wrong interpretation of hop-type. In the ietf model,
hop-type is reserved for the LOOSE or STRICT constraint description.
Instead, transponder info, following the usual ietf way, should be
included as a new path-route-object type. Such an object is not yet
defined in IETF, so this is a GNPy proposal to use a 'transponder' object
with type and mode attributes.
This is what has been encoded in request.py for requests and for answers

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
402155c225 change the 'no path' case answer
in case no path could be computed, the answer is changed according to
ietf path-computation model to a simple 'no-path' object

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
e5ec669419 applying changes in the request dict creation
+ removing the (currently) unused direction attribute.
This will be added again in a later PR when it is used

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
8f424e8c9d correction of json generation: removing unused objects
- removing empty objects (if there is no content, the object is removed),
  especially:
  - removing the empty synchronization vector if no disjunction
    constraint exists
  - removing the optional power and nb-of-channel fields if the user
    did not specify any
when reading, first check whether the object exists: a try/except catches
this case
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
fea2b84bb9 correction of the json generation for path computation
- changes concerning names due to ietf path-computation draft changes:
  - 'unnumbered-hop' changed to 'num-unnum-hop'
  - suppression of 'label-hop'
  - 'link-diverse' and 'node-diverse' changed to 'disjointness'
- correction because of a previously wrong interpretation of the model:
  - the detailed succession of include nodes moved from
    'optimisation/explicit-route-include-objects'
    to 'route-object-include-exclude'
    in the 'explicit-route-objects' attribute
- correction of keywords:
  - n and m replaced by N and M
  - strict and loose replaced by STRICT and LOOSE strings
  - the name of the response changed from 'path' to 'response'
The example service file is corrected accordingly

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
Jan Kundrát
0c918940c4 Merge pull request #313 from jktjkt/codacy-assert
codacy: do not complain about asserts in the test suite
2019-10-06 21:49:22 +00:00
Jan Kundrát
a63a6ac0ec codacy: do not complain about asserts in the test suite
Writing an explicit comparison followed by argument printing *and*
handling the final error is just backwards. Let the tool be quiet, this
nonsense does not really add any value.
2019-10-06 23:12:09 +02:00
Jan Kundrát
9f58b914d2 Merge pull request #308 from jktjkt/osnr-bw-printing
Rework OSNR printing
2019-10-06 19:17:46 +00:00
Jan Kundrát
029bac4b03 utils: more descriptive name for itufl
This might have nothing to do with the ITU frequency grid (it's really
just about a uniform distribution), so let's give it a more readable
name and more readable parameters.
2019-10-06 20:43:50 +02:00
Jan Kundrát
a27ad57220 Do not use confusing default values
This looks like the root cause of bug #243, the default values suggested
that this function works in THz.

These defaults are not used anywhere, so let's get rid of something
which adds no value and actively confuses people.
2019-10-06 20:38:31 +02:00
Jan Kundrát
8d31d924f2 Remove unused function 2019-10-06 20:37:24 +02:00
Jan Kundrát
8c3b514f90 Only print out "pretty summary" when not doing the power sweep 2019-10-01 15:24:40 +02:00
Jan Kundrát
3df27fe315 Print both ASE-OSNR and final GSNR for both 0.1nm and signal-bw in power sweep
As agreed upon during today's coders call, this is something that is
needed by actual users in the field, so let's prioritize their needs
over clarity of output for demo purposes.
2019-10-01 15:21:34 +02:00
Jan Kundrát
a6087ce354 Highlight OSNR @ 0.1nm
Esther and Jonas pointed out that it is much more common to work with
OSNR measurements per 0.1nm rather than per signal bandwidth. This is in
line with what OSAs usually report and what transponder datasheets
specify.

cc #307
2019-09-27 09:22:08 +02:00
Jan Kundrát
aae0382523 docs: show a DOI
As per Alessio's request, this adds a DOI and makes this software
citeable. DOIs are assigned automatically via Zenodo in a process that's
integrated with GitHub.
2019-09-23 18:11:44 +01:00
Jan Kundrát
b0c2acb1b5 Merge pull request #302 from Telecominfraproject/develop
Merge develop into master in preparation for the 1.8 release

Highlights for the upcoming release:

- Raman simulation
- automatic building of Docker images
- bugfixes, refactoring, CI and docs improvements
2019-09-23 16:51:28 +00:00
Jan Kundrát
a52c96ae2e Merge pull request #301 from jktjkt/develop
docs: Explain how to use Docker
2019-09-22 23:36:44 +00:00
Jan Kundrát
bf28821b5b Merge pull request #299 from jktjkt/cleanup-printing
(This replaces #261 by me and #293 by Jonas.)

##     Remove debug printing from propagate()
This is inspired by #293. The original issue was that the transponder OSNR was not accounted for. Rather than making the `propagate()` and `propagate2()` more complex, let's move the actual path printing to the demo code. There's no secret sauce in there, it's just a simple for-each-print thingy, so let's remove it from the library code.
    
fixes #262

## doc: fiber length summary in km, not meters

Reading `80000m` is a bit more complex than just `80 km`. Also let's add a space between the number and the unit for better readability.

## examples: color highlighting of the "most interesting output"
    
I think that this SNR value represents the most important output of this  example. There's plenty of debugging info on display already, so let's make this one more prominent.
    
I was thinking about moving the highlighting to elements' each `__str__()` function, but that felt a bit like layering violation to me. I could have also changed just the transponder's `__str__()`. In the end, I think that repeating just the final GSNR at the link-end transponder makes most sense here. This is aligned with what we talked about at yesterday's (on 2019-09-18 -- note that this is a backport from #261) demo for Microsoft, after all.
2019-09-22 23:32:36 +00:00
Jan Kundrát
328bd6ea71 docs: Explain how to use Docker 2019-09-23 00:29:59 +01:00
Jan Kundrát
ec9eb8d054 examples: commented-out path printing for Esther
I'm trying to clean up the core code while not causing too much trouble
to our users. Esther pointed out that there's an internal use case at
Orange where people are looking at the incremental results when
computing paths.

I strongly dislike commented-out debugging code, but for the sake of
progress, let's put it in here. It's roughly as good as that unused
`show` parameter, and it allowed a refactoring to make the code more
readable.
2019-09-23 00:14:56 +01:00
Jan Kundrát
f8c8526045 docs: remove duplicate cd
This whole section should be nuked, but that will have to wait until
after the next release for simplicity.

fix #283
2019-09-23 01:03:30 +02:00
Jan Kundrát
d8c236bb44 examples: do not measure time taken
This is an equivalent of 9c9e3be967 for
the other example.
2019-09-22 16:44:00 +01:00
Jan Kundrát
33ff0910b8 Remove unused function
This has been marked obsolete for 11 months. It's also one of the
callers of the propagate(), so it's being touched by that change which
removes the `show` parameter. This commit will make it easier to put
debug path printing just where it's really needed.
2019-09-22 12:20:11 +02:00
Jan Kundrát
faa69917d9 example: Use "Total SNR", not "GSNR"
Various presentations from Polito are slowly changing to use "GSNR" as a
"Generalized SNR", but it's true that our code does not use this term
anywhere, and that it is not properly explained. Let's wait a bit for
this term to become a bit more mainstream and for updated docs on our
side; then this commit can be safely reverted.

Thanks to Jonas for reporting this.
2019-09-20 17:11:42 +02:00
Jan Kundrát
d9f5ca9827 docs: Document Transceiver.mode.OSNR
I am not changing the JSON key name now because of the plan to go with a
full-blown YANG model (see #266).

fixes #298
2019-09-19 09:44:56 +02:00
Jan Kundrát
c817ef7335 examples: color highlighting of the "most interesting output"
I think that this SNR value represents the most important output of this
example. There's plenty of debugging info on display already, so let's
make this one more prominent.

I was thinking about moving the highlighting to elements' each __str__()
function, but that felt a bit like layering violation to me. I could
have also changed just the transponder's __str__(). In the end, I think
that repeating just the final GSNR at the link-end transponder makes
most sense here. This is aligned with what we talked about at
yesterday's (on 2019-09-18 -- note that this is a backport from #261)
demo for Microsoft, after all.
2019-09-19 09:25:31 +02:00
Jan Kundrát
07de489d6b doc: fiber length summary in km, not meters
reading "80000m" is a bit more complex than just "80 km". Also let's add
a space between the numebr and the unit for better readability.
2019-09-19 09:25:31 +02:00
Jan Kundrát
acafc78456 Remove debug printing from propagate()
This is inspired by #293. The original issue was that the transponder
OSNR was not accounted for. Rather than making the propagate() and
propagate2() more complex, let's move the actual path printing to the
demo code. There's no secret sauce in there, it's just a simple
for-each-print thingy, so let's remove it from the library code.

fixes #262
2019-09-19 09:25:31 +02:00
Jan Kundrát
325721545e Docker: array for CMD is more idiomatic
Co-Authored-By: Jonas Mårtensson <jonas.martensson@ri.se>
2019-09-18 13:15:20 +02:00
Jan Kundrát
dbe2bf560c Travis: Docker: Fix version numbers
This is due to travis-ci/travis-ci@7422.
2019-09-17 21:03:58 +02:00
Jan Kundrát
7872cc2203 Merge pull request #295 from jktjkt/docker
Automatic building of Docker images
2019-09-17 18:44:50 +00:00
Jan Kundrát
25b4d0e755 Automatic building of Docker images
A third, independent job in Travis CI just for building a container
image (and testing it a little bit). Here's how to use it once it is
built and published and pulled:

  $ docker run -it --rm --volume $(pwd):/shared telecominfraproject/oopt-gnpy

One can also pass commands to it directly. Note that the *relative* path
specifier given as `./` is required. One could possibly get around it by
$PATH manipulation, but hey, I have to stop somewhere.

  $ docker run -it --rm --volume $(pwd):/shared telecominfraproject/oopt-gnpy ./transmission_main_example.py

Thanks to Jonas for the --volume option trick and for that early
non-overwriting copying for the example directory.

The `install` phase in Travis CI is apparently common for all jobs in
the build matrix (it's rather confusing what they call a "stage" and
what is a "job" for them, and what their mutual relation is). Because
there's no point in doing any kind of installation in case when we're
building for Docker, let's just move the old `install` block into
`script`. With just that, however, Travis adds an implicit `pip install
-r requirements.txt` in an attempt at being helpful. In short, having
completely different "stuff to be done as a check" is very painful with
Travis.

The individual "script lines" in a job's `script` section are always
executed, all of them, even if a previous one failed. That's why I moved
the actual Docker invocation into an external shell script. When they
were not in a shell script and the script lines contained `set -e`, then
a failure of a particular command would cause the build to be marked as
"errored" instead of "failed" within Travis. Also, the multiline if
conditionals were rather painful to write.

One commit might end up in different branches. If that happens, the
second build would overwrite the image tag in the docker registry, which
is rather suboptimal. Let's try to fetch that image first, and only
update the latest/stable tags if the image was already available
beforehand.

fixes #260
2019-09-17 20:37:45 +02:00
Jan Kundrát
9af1c90664 Merge pull request #249 from jktjkt/bug-243
Do not mix THz and Hz
2019-09-04 08:18:49 +00:00
Jan Kundrát
6b4d44a3f1 Merge pull request #292 from ojnas/fix-snr-per-channel
Fix per-channel SNR output
2019-09-03 08:23:28 +00:00
Jonas Mårtensson
2faf8d2cdd - Fix output per-channel SNR to include contribution from add_drop_osnr and tx_osnr
- Avoid recalculating SNRs in transmission main example since they are already calculated and stored in the Transceiver class
- Also include per channel power (useful for checking tilt) and OSNR ASE in the output
2019-09-03 09:25:09 +02:00
Jan Kundrát
676c94ddf2 Merge pull request #289 from jktjkt/raman-nli-interpolation 2019-09-02 21:02:58 +00:00
Jan Kundrát
6f93b64f84 Simplify logic for showing per-channel results
As Jonas pointed out, the code used to contain a check for non-nan
values, effectively skipping channels where the Raman gain was not
explicitly computed.

Now that we do not introduce NaNs into some channels anymore, this
shortcut no longer works. We could either add explicit filtering for
only showing those channels which are covered by the Raman engine, but
then the result would be rather confusing in the non-Raman case. One
could also add another column with the simulated vs. approximated NLI,
but when I tried this, the output looked a bit cluttered.

I think that the best course of action for now is to just show info
about all channels (if asked by the user). So this is just a cleanup for
a condition which is now always on.
2019-09-02 21:10:20 +02:00
Jan Kundrát
54bf426472 Raman: linear interpolation of channel NLI
The Raman engine computes NLI just for a subset of channels; this is an
important speed optimization because the computation is rather CPU
heavy. However, the code just left NaNs in place for NLI of those
channels which were not explicitly simulated.

This is wrong because these NaNs propagate all the way to the total
input/output powers per the whole spectrum, etc, leading to NaNs being
shown to the user.

This patch uses a very simple linear approximation just in order to
prevent these NaNs.
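
A toy numpy sketch of the kind of gap-filling described (values and array names
invented for illustration):

  import numpy as np

  freq = np.array([191.5e12, 192.0e12, 192.5e12, 193.0e12])
  nli = np.array([1.0e-6, np.nan, np.nan, 1.6e-6])
  known = ~np.isnan(nli)
  # linearly interpolate the missing per-channel NLI values
  nli_filled = np.interp(freq, freq[known], nli[known])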

fixes #288
2019-09-02 21:10:20 +02:00
Jan Kundrát
1862ce9104 Merge pull request #263 from aleFerrari/raman_feature
Raman feature

I've added some very basic documentation (really, just usage instructions). What is still missing:

- more specific docs
- test coverage (right now, there's none!)

We've agreed to merge this sooner rather than later so that more people can reasonably test it. There's a bunch of things that I'm not happy with, such as the default example showing NaNs as the OSNR to the user at the end transponders, etc (I suspect this is due to simulating just some channels and no approximation, right?), but let's solve these based on feedback from testing.

As proposed in my earlier comments, I went ahead and merged the wrapper example code into the `transmission_main_example.py`. So here's what needs to be done in order to "use Raman amplifiers" in a basic example, with no autodesign:

- replace some `Fiber` elements with a `RamanFiber` in the network topology JSON
- invoke the example with a `--sim examples/sim_params.json` which activates the Raman engine
- pass the `--show-channels` flag so that the actual results are visible

This is more or less covered by the docs skeleton.
2019-08-08 12:05:53 +00:00
Jan Kundrát
3771c13d32 docs: basic usage instructions for Raman 2019-08-08 13:38:23 +02:00
Jan Kundrát
f1d0230dad docs: skeleton of Raman-specific properties
What I'm including here are comments from Vittorio (thanks!). It seems
that the majority of actual numeric parameters which drive the
simulation are undocumented, so this one will have to wait for Alessio
or someone else from the Polito team to fill the blanks.
2019-08-08 13:38:23 +02:00
Jan Kundrát
182929cc96 Unify transmission_main_example and the Raman wrapper
There were just these substantial differences:

- the Raman code showed a per-channel SNR summary, this is now
  controlled via the `--show-channels` option

- when a Raman fiber is used the `--sim` option for specifying input
  simulation parameters is now mandatory

I'm therefore merging these two files even though we've previously
decided not to do this -- consult the review comment at
https://github.com/Telecominfraproject/oopt-gnpy/pull/263#discussion_r310506082
and the discussion during this Tuesday's coders call). If this turns out
to be a problem for autodesign, we can always revert this.

One possible catch is that the final "SNR total" shows NaN for the
default Raman example. That's just the way the simulation engine works
right now, I'm afraid. The `--show-channels` options helps a lot.
2019-08-08 13:38:23 +02:00
Jan Kundrát
81585c5a86 Unify implementations of psi computation
Both of these places referred to "eq. 123 from arXiv:1209.0394", the
only difference (apart from the source of the input parameters, beta2
and asymptotic_length) was calling the two branches "SCI" and "XCI" vs.
"SPM" and "XPM".

In this commit I've only moved the code to a single implementation. The
input data are still being read from the same parameters, of course.
2019-08-08 13:38:23 +02:00
Jan Kundrát
2f52c11589 More intuitive name for list of channels where Raman gain is computed 2019-08-08 11:34:33 +02:00
Jan Kundrát
0f4d8573cf Move Raman parameter propagation to gnpy.core.network
Conceptually, this is just about propagating the input parameters (which
drive the simulation) into all RamanFiber instances. The network module
already contains similar functions, let's move it there.
2019-08-08 11:19:25 +02:00
Jan Kundrát
660b8b3c6e Use lin2db() when we have it 2019-08-08 11:16:26 +02:00
Jan Kundrát
71d6a1138c Fix a typo in my e-mail address
This has been around since before 1a104956, but I haven't noticed the
missing `.com`.
2019-08-08 10:53:52 +02:00
Jan Kundrát
a6e741d8fe Do not copy the whole Fiber class 2019-08-08 09:55:23 +02:00
Jan Kundrát
58bcf65cf6 Raman: do not introduce a new copy of the equipment library
Compared to `eqpt_config.json`, the only extra content in the new copy
was the `RamanFiber` block. There's no disadvantage in just using one
equipment library; the traditional code can easily ignore the RamanFiber
stanza.
2019-08-07 23:45:44 +02:00
Jan Kundrát
27ce55de38 diagnostics: Correctly report all spans, not just the Raman ones
It's just a harmless message, the path traversal was still using all
connections as defined in the topology JSON.
2019-08-07 23:41:30 +02:00
Jan Kundrát
36ca22db9b Raman: use logging instead of debugging print()s 2019-08-07 23:31:20 +02:00
Jan Kundrát
33a8de9b39 Raman: stricter validation of input parameters
The goal here is to try to prevent typos in configuration from slipping
in undetected.
2019-08-07 13:36:21 +02:00
Jan Kundrát
22b76e36db Fix typos in Raman's frequency offset
This is a typo, the parameters were clearly supposed to be uniformly
spaced. It doesn't affect the results (much), because the remaining
values are "close enough" for a reasonable interpolation.

Using the existing Raman example, the difference in OSNR ASE at the
signal bandwidth is 0.01 dB (31.46 -> 31.47 dB).

Perhaps it would make sense to enforce that these offsets are a
monotonic sequence.
2019-08-07 13:19:17 +02:00
Jan Kundrát
528ff31590 CI: Run the Raman example as well 2019-08-06 11:56:01 +02:00
Jan Kundrát
4d6966cbd3 Remove unused imports 2019-08-06 11:51:25 +02:00
Jan Kundrát
9c9e3be967 Do not measure time from the example directly
Let users wrap this with the Unix `time` command if they are interested
in the time spent.
2019-08-06 11:49:32 +02:00
Jan Kundrát
2dd4745ef7 Merge pull request #281 from jktjkt/no-convenience-access
Remove property aliases
2019-08-06 09:46:36 +00:00
Jan Kundrát
4e786a32b5 Merge branch 'no-convenience-access' into raman
This required some adaptations in the new Raman code now that the
property aliases are gone.
2019-08-06 11:43:16 +02:00
Jan Kundrát
6ecb2c85e2 Merge remote-tracking branch 'origin/develop' into raman
Required fix-ups:
- ROADM restrictions
2019-08-06 11:34:39 +02:00
Jan Kundrát
cd234a909b Remove unimplemented and unused code 2019-08-06 11:22:56 +02:00
Jan Kundrát
c249f44ea1 Remove property aliases
For some reason, the code allowed using "convenience names" for
accessing properties since commit 58ac717f. To me, this looks like an
obvious anti-pattern because accessing a single property via three
different names only makes the code less readable. Let's kill this
"feature".

In case of the `Power` class, the code used "ase" and "nli" on the
majority of places, so let's use these abbreviations instead of their
spelt-out variants.

SpectralInformation was "clean" already, but there were calls to the
`update()` wrapper around the `namedtuple._replace`.  Given that there
were no property aliases, it's safe to just call `_replace()` directly.

In case of the `Pref` class, once again always use `p_span0`, `p_spani`
instead of `p0` and `pi` -- it's a trivial change.
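For reference, `_replace()` is just the standard namedtuple API, shown here
with a toy class rather than the project's own definitions:

>>> from collections import namedtuple
>>> Power = namedtuple('Power', 'signal ase nli')
>>> p = Power(signal=1e-3, ase=1e-6, nli=1e-7)
>>> p._replace(ase=2e-6)
Power(signal=0.001, ase=2e-06, nli=1e-07)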
2019-08-06 11:14:28 +02:00
Alessio Ferrari
ed1f51393a method name for computing NLI updated in sim_params.json 2019-08-01 14:54:33 +02:00
Alessio Ferrari
8bd43130ab modified Raman pumps in the example raman_edfa_example_network.json to make Raman effect more evident 2019-08-01 14:48:16 +02:00
Alessio Ferrari
6c975a53a1 minor fix to eqpt_with_raman_config.json 2019-08-01 14:45:17 +02:00
Alessio Ferrari
8a1001cd40 dynamic evaluation of threshold frequency between near XPM and far XPM 2019-08-01 14:42:47 +02:00
Alessio Ferrari
beb2b576aa changed output of propagate_raman_fiber to return a list 2019-08-01 14:41:40 +02:00
Alessio Ferrari
8f3923046b alpha0 computation moved from NLI solver to Fiber Params 2019-08-01 11:36:32 +02:00
Alessio Ferrari
88c68d2065 introduced classes for the parameters 2019-08-01 11:32:07 +02:00
Alessio Ferrari
8bcde72a10 changes on variable names to be more clear 2019-08-01 10:49:25 +02:00
Alessio Ferrari
4653dbcf4b introduced a faster computation for XPM 2019-08-01 10:48:02 +02:00
Alessio Ferrari
cde08b32a4 output fixes on transmission_with_raman_main_example.py 2019-08-01 10:45:20 +02:00
Alessio Ferrari
2eed891f8d new variable names for SPM and XPM 2019-07-31 11:29:25 +02:00
Alessio Ferrari
c0b84e84c8 double underscore replaced with single one for some attributes 2019-07-31 11:24:16 +02:00
Alessio Ferrari
2c20fd3f9f strings 'XPM' and 'SPM' removed while computing NLI, now it is implicit 2019-07-31 11:20:34 +02:00
Alessio Ferrari
f4db56ca29 minor fix to comments 2019-07-31 11:18:46 +02:00
Alessio Ferrari
5b6d58ac7d minor fix to printed text 2019-07-29 14:45:32 +02:00
Alessio Ferrari
ecb8bd9fbe new print of final propagation 2019-07-29 14:17:27 +02:00
Alessio Ferrari
2f9385451f removed old printing of spectral information 2019-07-29 14:16:49 +02:00
Alessio Ferrari
1a1346461b self.pch_out_db in RamanFiber now takes into account Raman gain 2019-07-29 12:33:05 +02:00
Jan Kundrát
27dcd29074 CI: Attempt to build docs during regular CI, too
Since ReadTheDocs does not support [1] running as a CI check on GitHub
yet, let's make sure that we at least run sphinx. This would have caught
a recent docs breakage (see parent commit).

[1] https://github.com/readthedocs/readthedocs.org/issues/1340
2019-07-04 00:37:58 +02:00
Jan Kundrát
93986f36c3 docs: Fix docs building
Docs building started failing after our dependency update. The reason is
that the updated sphinx bibtex extension started being a bit stricter in
its interpretation of bibliography files:

  parsing bibtex file /home/jkt/work/TIP/oopt-gnpy/docs/biblio.bib...
  Exception occurred:
    File "/home/jkt/work/TIP/_py/lib64/python3.6/site-packages/pybtex/errors.py", line 78, in report_error
      raise exception
  pybtex.database.input.bibtex.DuplicatePersonField: entry with key bononi_modeling_2012 has a duplicate year field

Fix this by using `date` as field name when a full date is given.
2019-07-04 00:37:23 +02:00
Jan Kundrát
a6ab8055b1 CI: do not attempt to cd twice
I misunderstood how Travis CI works; it's not a subshell, it's something
which affects the other commands. Let's just run everything from the
top-level directory.
2019-07-04 00:28:33 +02:00
Jan Kundrát
31ea479d7f Merge pull request #268 from jktjkt/prune-dependencies
Update and rework dependency handling

fixes #269
2019-06-25 15:25:48 +02:00
Jan Kundrát
89fb2e047b Update dependencies
Pinning dependencies to a specific version is safe in terms of
preventing accidental breakage, but it has a cost of possibly using
outdated dependencies.

Let's see if we can rely on the semantic versioning of our dependencies
here.

I'm doing this instead of other approaches (such as splitting the
top-level deps from "the rest of the dep chain" [1][2]) because I require
additional tools in my venv, such as the python-language-server. There
would be little point in injecting these into requirements to be used by
other people.

[1] https://www.kennethreitz.org/essays/a-better-pip-workflow
[2] 59eb51c026
2019-06-21 12:13:38 +02:00
Jan Kundrát
8f705e6173 requirements: only list what we really need
Listing all transitive dependencies, including their respective versions
(as done by `pip freeze`, for example) is something which effectively
leads to installing outdated and possibly vulnerable software.

Let's get rid of the transitive dependencies.

Thanks to Esther for noticing.
2019-06-21 11:46:17 +02:00
Jan Kundrát
8f735316f5 Merge pull request #264 from Telecominfraproject/master
Merge master into develop

...so that the docs are updated and synced.
2019-06-19 14:28:45 +02:00
Jan Kundrát
0d7a1871a1 Merge pull request #259 from jktjkt/docs
docs: Show a pretty picture of GNPy in action

Thanks to Esther for suggesting this, and to Gert and Esther for their reviews.
2019-06-19 11:57:55 +02:00
Jan Kundrát
33832b3d25 docs: Animated invocation of transmission_main_example.py
Tools used:
- [asciinema](https://asciinema.org/) for recording the session
- `$EDITOR` for a few simple fix-ups
- [termtosvg](https://github.com/nbedos/termtosvg) for creating the
resulting SVG so that it's fully self-contained
- pushing the results to the `gh-pages` so that it's available and not
subject to [random filtering and
breakage](https://stackoverflow.com/questions/13808020/include-an-svg-hosted-on-github-in-markdown),
such as the animated SVG not actually animating

I'm still linking to asciinema.org because that site offers nice
cut-and-paste capabilities, playback control, etc.

Co-Authored-By: Gert Grammel <ggrammel@juniper.net>
2019-06-19 11:54:44 +02:00
Alessio Ferrari
4da7f0cc38 added evaluation of the SNR at the end of the line 2019-06-18 13:31:50 +02:00
Alessio Ferrari
e29f8485ea integration of the GGN-model with spectral separation 2019-06-18 13:31:01 +02:00
Alessio Ferrari
2da344a563 sim_params now include GGN model 2019-06-18 13:27:02 +02:00
Alessio Ferrari
2a0cb8e14f raman pump powers reduced in raman_edfa_example_network.json 2019-06-18 13:21:05 +02:00
Jan Kundrát
e1dc3dc357 docs: no need to cd examples first
The code determines these paths relative to the actual example .py file
already. Given that `pytest` and other bits expect to run from the
top-level directory, let's remove this misleading instruction from the
docs.
2019-06-17 20:01:36 +02:00
Jan Kundrát
8259124f73 docs: Show a pretty picture of GNPy in action
I do not have a vector image of this one, unfortunately. The data
apparently comes from "someone at TIP", perhaps a hired graphic
designer. It was passed to me by Diego Landa when I asked for the
original dataset.
2019-06-17 19:55:00 +02:00
Alessio Ferrari
0422956ac6 RamanFiber propagate method now calls the propagate_raman_fiber in the science_utils module 2019-06-11 13:40:42 +02:00
Alessio Ferrari
ff82c5171b propagate_raman_fiber function introduced 2019-06-11 13:39:41 +02:00
Alessio Ferrari
f9bd6310f1 introduce NLI solver 2019-06-11 13:38:39 +02:00
Alessio Ferrari
471eab126e fix raman solver parameters 2019-06-11 13:38:05 +02:00
Alessio Ferrari
6ad011d12d raman_efficiency included in RamanFiber equipment 2019-06-11 13:28:39 +02:00
Alessio Ferrari
561c8aff85 modified sim_params for transmission_with_raman_main_example.py 2019-06-11 13:27:40 +02:00
Alessio Ferrari
5cf5dd2234 add temperature parameter in fiber 2019-06-11 13:24:12 +02:00
Alessio Ferrari
fb9915d301 add raman efficiency in SSMF equipment 2019-06-11 13:23:38 +02:00
Jan Kundrát
7ab67194d6 Merge a3778dfe8b into 603ac9d8c5 2019-06-11 08:47:09 +00:00
Jan Kundrát
603ac9d8c5 Merge pull request #257 from jktjkt/fixes
Python: do not use a mutable default value
2019-06-11 10:46:25 +02:00
Jan Kundrát
a3c7811e9d Merge pull request #256 from jktjkt/docs
Documentation fixes (plus an equipment.Fiber.beta2 change)
2019-06-11 10:45:23 +02:00
Jan Kundrát
a3778dfe8b Merge pull request #255 from jktjkt/weekly-calls
Encourage people to join us and to join the VCs
2019-06-11 10:45:00 +02:00
Jan Kundrát
2dff934612 CI: run the examples as well
We are not checking their results or anything, but it is nice to be able
to say "hey, these still work".
2019-06-07 00:02:53 +02:00
Jan Kundrát
89d666948e Fiber: beta2: use a default value directly
Rather than doing the None-dance and documenting the default value within a
docstring, let's use the default value directly. This is safe because
numbers are immutable.
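A sketch of the pattern; the class body and the numeric value are illustrative,
not the real Fiber definition:

  class Fiber:
      def __init__(self, beta2=-21.27e-27):
          # a float is immutable, so a literal default cannot leak state between instances
          self.beta2 = beta2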
2019-06-06 23:55:16 +02:00
Jan Kundrát
c3499142b0 doc: hyperlinks++ 2019-06-06 23:47:52 +02:00
Jan Kundrát
d8feccc715 Python: do not use a mutable default value
Because default arguments are evaluated *once*, not every time the function is
called, a mutable default value is not "reset", and this happens:

>>> from gnpy.core.node import Node
>>> x=Node('123')
>>> y=Node('456')
>>> print(x.metadata)
{'location': Location(latitude=0, longitude=0, city=None, region=None)}
>>> print(y.metadata)
{'location': Location(latitude=0, longitude=0, city=None, region=None)}
>>> y.metadata['foo']=123
>>> print(x.metadata)
{'location': Location(latitude=0, longitude=0, city=None, region=None), 'foo': 123}

This is easily fixable by using an immutable value as a placeholder
here.
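The usual fix, sketched with a toy class rather than the real Node:

  class Node:
      def __init__(self, uid, metadata=None):
          self.uid = uid
          # build the mutable default inside the body so each instance gets its own dict
          self.metadata = metadata if metadata is not None else {'location': None}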
2019-06-06 23:29:07 +02:00
Jan Kundrát
16173355f3 docs: random improvements and Sphinxiation 2019-06-06 23:26:12 +02:00
Jan Kundrát
f46134fda5 docs: document the _private methods
This is reference documentation, so it makes sense for it to be
reasonably complete.
2019-06-06 23:05:42 +02:00
Jan Kundrát
bfecff0412 docs: use a default preset here to prevent extra repetitions 2019-06-06 23:04:34 +02:00
Jan Kundrát
168f1891cf docs: ensure that all modules are documented 2019-06-06 22:55:56 +02:00
Jan Kundrát
862845b4ac docs: minor grammar fixes 2019-06-06 21:51:48 +02:00
Jan Kundrát
b7a5dbff49 Encourage people to join us and to join the VCs
This was suggested by Esther on today's PSE group call.
2019-06-06 18:39:21 +02:00
Jan Kundrát
5be30d89a7 Merge pull request #230 from Orange-OpenSource/build_no_amp_in_roadm
ROADMs can now specify which amplifiers can be used as their preamps and boosters.
2019-06-06 12:47:06 +02:00
Esther Le Rouzic
d94dc51d88 Restrictions on auto-adding amplifiers into ROADMs
This feature is intended to support designs such as OpenROADM where the
line degree integrates a specific preamp/booster pair. In that case, it
does not make sense for our autodesign to "pick an amplifier". The
restrictions can be activated by:

- Listing them in `eqpt_config.json`, so that they are effective for all
ROADM instances.
- On a per-ROADM basis within the Excel sheet or the JSON definitions.

Restrictions apply to an entire ROADM as a whole, not to the individual
degrees.

If a per-degree exception is needed, the amplifier of this degree can be
defined in the equipment sheet or in the network definition.

If no booster amplifier should be placed on a degree, use the `Fused`
node in place of an amplifier.
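Expressed as the Python dict that the JSON loader would produce, such a stanza
could look roughly like this (the amplifier variety names are made up and the
exact keys are only an assumption):

  roadm_config = {
      'target_pch_out_db': -20,
      'restrictions': {
          'preamp_variety_list': ['low_noise_preamp'],
          'booster_variety_list': ['std_booster'],
      },
  }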

Signed-off-by: Esther Le Rouzic <esther.lerouzic@orange.com>
Co-authored-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
2019-06-06 11:58:45 +02:00
Jan Kundrát
22acd88d44 Utility functions for pruning and merging
Co-authored-by: Esther Le Rouzic <esther.lerouzic@orange.com>
2019-06-06 11:42:05 +02:00
Alessio Ferrari
fd406c106b gn model introduced in the science utils module 2019-06-03 16:33:26 +02:00
Alessio Ferrari
16134b5caf +1 and -1 replaced with coprop and counterprop strings for Raman pumps description 2019-06-03 16:27:58 +02:00
Jan Kundrát
2c485efced Merge pull request #252 from Orange-OpenSource/bug_fixes_create_eqpt_sheet
Refactoring and bug fixes in an auxiliary transformation tool.
2019-05-31 16:26:10 +02:00
EstherLerouzic
279d08a0e8 Correct testTopology file
testTopology listed corlay twice in Eqpt although it is an ILA;
it should appear only once in the Node A column

update expected parser results for this change in tests/data

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-05-31 11:31:46 +01:00
EstherLerouzic
1d4a8998e1 Bug fix on create_eqpt_sheet.py
The program was not correctly listing nodes when links duplicated EDFA entries.
I refactored it to make it simpler to understand.

I added coordinates to ../tests/data/testTopology.xls for plotting purposes

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-05-31 11:31:46 +01:00
Jan Kundrát
47a41e7980 Merge pull request #251 from jktjkt/json-parsing-exceptions
Use exceptions instead of `sys.exit()` when reporting configuration errors
2019-05-31 11:24:55 +02:00
Jan Kundrát
ecfc4a8cb2 One fewer exit() in a library scope 2019-05-30 13:58:33 +02:00
Jan Kundrát
2d66b6266b Add a missing FIXME for direct-printing of diagnostics 2019-05-30 12:39:42 +02:00
Jan Kundrát
b7afb5f9d2 Exceptions for errors in network topologies 2019-05-30 12:39:10 +02:00
Jan Kundrát
58c16a59ac Distinguish between "equipment library errors" and other "config errors"
And because no code for these "other config errors" has been merged yet,
this is just a placeholder for future work.
2019-05-30 12:28:47 +02:00
Jan Kundrát
f09789f5ef Refactor color terminal escaping into a common module
I have no idea which one of the existing pypi modules is best for these,
and given that we're using just two escape codes, I think it makes sense
not to bother with a more capable third-party module just for two magic
strings.
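Roughly what that module boils down to; the exact escape sequences here are
illustrative, not necessarily the ones used in the code:

  RED = '\x1b[1;31m'
  RESET = '\x1b[0m'

  def red(text):
      """Wrap a message in the 'bright red' terminal escape code."""
      return f'{RED}{text}{RESET}'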
2019-05-30 12:25:53 +02:00
Jan Kundrát
b2e12cd3e0 Use exceptions instead of direct-exit when parsing equipment JSON 2019-05-30 12:06:54 +02:00
Jan Kundrát
71b157a8ba Infrastructure for reporting configuration errors via exceptions
Once the actual config-parsing code starts raising these exceptions
instead of directly calling sys.exit(), the user experience would
deteriorate due to raw exception traces. There's little value in the
trace itself, so just wrap the whole config loading with pretty error
formatting.

We still do not point to a specific place where that error is defined
(such as a line/column in a given JSON file) because that information is
already lost by the time we perform these checks.

Also, these checks are largely open-coded ad-hoc stuff. Some required
items are not covered, raising KeyError instead. We should get a formal
schema for these...
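A minimal sketch of that wrapper; the exception class and the loader callable
are placeholders, not the project's actual names:

  import sys

  class ConfigurationError(Exception):
      """Raised when the equipment library or topology cannot be loaded."""

  def load_or_abort(loader, path):
      try:
          return loader(path)
      except ConfigurationError as error:
          # show a short, human-readable message instead of a raw traceback
          print(f'Configuration error: {error}', file=sys.stderr)
          sys.exit(1)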
2019-05-30 12:06:54 +02:00
Alessio Ferrari
cb8affe9b2 transmission with raman integrated with sim_params configuration 2019-05-28 12:19:01 +02:00
Alessio Ferrari
3f7180c706 configure_network function created to configure the RamanFiber.sim_params in a network 2019-05-28 11:35:42 +02:00
Alessio Ferrari
f0bc2dc62f attribute sim_params added to RamanFiber 2019-05-28 11:11:40 +02:00
Jan Kundrát
9c95fd6b69 Do not mix THz and Hz
While the `itufl()` function uses THz by default, its values depend on
the frequency range that is passed via arguments. In this context, the
f_min and f_max come from the default SpectralInformation which is in
Hz.

Thanks to @ojnas for reporting this.

fixes #243
2019-05-27 18:35:14 +02:00
Alessio Ferrari
c0fda8c3a2 fix to Raman solver 2019-05-27 17:09:34 +02:00
Alessio Ferrari
bac20af381 eqpt_with_raman_config.json copied from eqpt_config.json and RamanFiber added into it 2019-05-27 17:08:36 +02:00
Alessio Ferrari
626211a320 Introduce RamanFiber in equipment 2019-05-27 17:04:33 +02:00
Jan Kundrát
783aaa8cb4 Merge remote-tracking branch 'origin/master' into develop 2019-05-27 12:27:23 +02:00
Jan Kundrát
768bd8af19 Merge pull request #247 from jktjkt/rst-fixes
docs: fix RST formatting and introduce CI coverage
2019-05-27 12:26:42 +02:00
Jan Kundrát
3894f52194 CI: test the RST files for validity
fixes #245
2019-05-27 12:08:02 +02:00
Jan Kundrát
dcfa9edb1c docs: Fix JSON syntax 2019-05-27 11:36:02 +02:00
Jan Kundrát
4ebdb5629c docs: specify code-block highlighter 2019-05-27 11:35:56 +02:00
Jan Kundrát
75b0668fc2 docs: Fix JSON syntax -- standard ASCII quotes 2019-05-27 11:35:52 +02:00
Jan Kundrát
5fe94ed463 docs: specify a correct markup language for JSON snippets
Some of these are "pseudo-JSON" with `...` etc, so one has to use `none`
for these.
2019-05-27 11:35:49 +02:00
Jan Kundrát
db21b97603 docs: fix table markup syntax
Oh boy, this is annoying.
2019-05-27 11:35:47 +02:00
Jan Kundrát
8074d0c548 docs: fix code markup
- use a correct highlighter
- ensure that the code is not parsed as RST code-block options
2019-05-27 11:07:53 +02:00
Jan Kundrát
2d611afbb0 Merge pull request #241 from Telecominfraproject/master
Merge master into develop
2019-05-24 11:36:46 +02:00
Jan Kundrát
bc42507724 Merge pull request #239 from jktjkt/spectrial-info-fixes
docs: Fix rendering of the SpectralInformation docs
2019-05-24 11:24:00 +02:00
Jan Kundrát
ff82ab5718 docs: Fix rendering of the SpectralInformation docs
Due to a misaligned |, the table was not rendered at all in the GitHub
overview.

Thanks to Gert for catching this in his brownfield doc.
2019-05-24 11:00:14 +02:00
Alessio Ferrari
62fe374e15 modified fiber type in raman_edfa_example_network.json to use new class RamanFiber 2019-05-24 10:38:24 +02:00
Alessio Ferrari
e519a3bc39 add class RamanFiber 2019-05-24 10:36:29 +02:00
Jan Kundrát
4f146d12ee Merge branch 'pytest-warnings' into develop 2019-05-24 01:45:53 +02:00
Jan Kundrát
46d6074ad5 Merge branch 'trivial-bugfixes' into develop 2019-05-24 01:40:07 +02:00
Jan Kundrát
cbb61f1240 tests: consult the doctests as well
There are no doctests right now, but they will (might?) be added (see
PR #230).
2019-05-24 01:38:12 +02:00
Jan Kundrát
0e9f3c3576 tests: re-enable warnings
For some reason, warnings were disabled in commit 07de78c, but it is not
clear to me what the purpose was back then. It passes without warnings
locally, and I think that warnings are useful in general, so let's give
the CI a try.
2019-05-24 01:37:49 +02:00
Alessio Ferrari
92f11dc075 add an example of json including Raman pumps in fiber 2019-05-23 15:05:08 +02:00
Jan Kundrát
3aa0a0999b Update networkx and xlrd
...in an attempt to kill some deprecation warnings under Python 3.7.
2019-05-22 17:46:41 +02:00
Jan Kundrát
e86fbcfa5b CI: Disable Codecov comments
The results are still perfectly visible within the *Checks* tab, so
let's cut the comment spam a little bit.

I've already done the same for lgtm.com; for that app, it's done in
their service's configuration within the web app.
2019-05-22 17:33:13 +02:00
Alessio Ferrari
0b2ee6fdaf add Raman solver 2019-05-22 12:05:51 +02:00
Alessio Ferrari
35f3866882 load_sim_params used to parse sim_params.json and new default filename 2019-05-22 11:06:58 +02:00
Alessio Ferrari
d86bea80d3 fix load_sim_params 2019-05-22 11:00:02 +02:00
Alessio Ferrari
13b4b5072f fix sim params 2019-05-22 10:59:00 +02:00
Alessio Ferrari
efae43f122 utility to load sim params from json 2019-05-22 10:26:24 +02:00
Alessio Ferrari
45e8c8692b add science utils module 2019-05-22 10:09:27 +02:00
Alessio Ferrari
f8fc2a5050 simparams enriched with Raman and NLI parameters 2019-05-22 10:02:45 +02:00
Jan Kundrát
3c20d57cc4 Be sure that both nf_min and nf_max are removed
If `nf_min` was not present, then `nf_max` would be left remaining
because an attempt to nuke `nf_min` raised an exception.

Thanks to codecov.io for making it easy to spot this on the coverage
page.
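Using dict.pop() with a default avoids the exception altogether; the key names
below mirror the JSON stanza but the values are made up:

>>> json_data = {'nf_max': 10, 'gain_flatmax': 26}
>>> json_data.pop('nf_min', None)
>>> json_data.pop('nf_max', None)
10
>>> json_data
{'gain_flatmax': 26}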
2019-05-21 16:29:09 +02:00
Jan Kundrát
2cb3858330 Exceptions must be raised, not just instantiated
Found by lgtm.com code scanner, thanks!
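A trimmed-down illustration of this class of bug (the function and the accepted
values are made up):

  def check_type_def(type_def):
      if type_def not in ('variable_gain', 'fixed_gain', 'dual_stage'):
          # writing ValueError(...) without `raise` builds the object and throws it away;
          # only `raise` actually aborts the caller
          raise ValueError(f'unknown amplifier type_def {type_def!r}')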
2019-05-21 16:20:26 +02:00
Jan Kundrát
925d36a561 CI: do not require sudo
...which might give us a faster, container-based build environment as
long as some legacy, several-years-old docs are still correct. Let's
give it a try.
2019-05-21 13:04:44 +02:00
Jan Kundrát
0ffaca91cc CI: Test on Python 3.7 as well
According to the Travis-CI docs, this requires switching to a newer base
OS image. It might not require an explicit pin like this in the near future
as the default is, apparently, being migrated at this very time, but I
do not feel like working with a random rolling-update process of a
third-party service, thank you.
2019-05-21 12:27:41 +02:00
Jan Kundrát
d3a0f1d969 CI: remove unused section 2019-05-21 12:17:33 +02:00
Jan Kundrát
37704db583 CI: Report test code coverage to codecov.io 2019-05-21 12:09:24 +02:00
Alessio Ferrari
aadd038bbe transmission w Raman gets sim params 2019-05-20 17:17:34 +02:00
diegolaunch
6d601b4267 Merge pull request #232 from jktjkt/unify-authors-list
Add all missing authors and sync the two lists of contributors
2019-05-20 08:10:24 -07:00
Jan Kundrát
194798d881 Add all missing authors and sync the two lists of contributors
These two lists were not synced with each other, and some contributors
whose commits have been merged in git were missing. The resulting list
is a bit longer because not everybody who contributes does that strictly
via commits pushed to GitHub.
2019-05-20 12:31:07 +02:00
Jan Kundrát
1a10495645 Replace James with myself as a maintainer
The credits/attributions have not been touched of course.
2019-05-20 12:31:04 +02:00
Alessio Ferrari
0e2316513e add raman transmission 2019-05-18 15:41:59 +02:00
Alessio Ferrari
72ce4e2fad add json sim_params 2019-05-18 15:39:48 +02:00
Jan Kundrát
4ad7311e18 Merge branch 'develop' 2019-05-17 21:25:08 +02:00
Jan Kundrát
fa2b0e8fad docs: use github for release management
The changelog is available on GitHub (and also within the git tags,
now that I added the missing v1.2 tag), so let's cut the verbosity a bit
by just linking to a dedicated page for this trivia.
2019-05-17 20:57:38 +02:00
Jan Kundrát
78eb926693 docs: prune the list of branches
The README is way too long already. Welcoming potential users and
contributors with an overview of the project's history does not help,
IMHO.
2019-05-17 20:46:37 +02:00
Jan Kundrát
3613efbaab docs: there's the --power argument for launch power config 2019-05-17 20:31:50 +02:00
Jan Kundrát
2e732854b3 docs: Improve typography and fix typos in the README
It's quite common to use a ``monospace font`` for referring to
identifiers, so let's use that for GitHub's rendering of the main
project's README.
2019-05-17 20:29:14 +02:00
Jan Kundrát
f9560d6b1d Merge branch 'master' into develop
This is mainly to resolve one conflict in README and to have CI run its
tasks.
2019-05-16 13:09:42 +02:00
Jan Kundrát
51b0826398 Merge pull request #226 from Orange-OpenSource/bugs_correction
correction of 2 bugs
2019-05-16 13:06:09 +02:00
Jan Kundrát
af0adb454d Fix RST syntax in the ROADM table
The table was completely hidden from the HTML rendering at GitHub. The
space between the "left edge" of the table and the text content appears
to be substantial.
2019-05-16 12:41:01 +02:00
EstherLerouzic
cdd4c571b0 correction of bugs
a KeyError exception in the service sheet was not correctly caught
print Request.mode instead of Requestmode (not defined at this point)

selection of modes did not respect the min spacing criterion:
constraint added

transmission_main did not give SNR in 0.1 nm: added in std out

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-05-07 12:28:28 +01:00
James
9c440764c7 Merge pull request #224 from Orange-OpenSource/auto_design3
Auto design3
2019-04-18 11:42:09 -04:00
James
6e94834033 Merge pull request #223 from Orange-OpenSource/addWeightEdge
Add weight edge
2019-04-18 11:41:46 -04:00
EstherLerouzic
1720ed23c9 Sort all simple paths using length instead of hop
when constraints such as include node or disjunction are applied,
the candidates are sorted with the shortest path in length first

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-18 11:16:07 +01:00
EstherLerouzic
137fab1d92 Adding weights on edges to have shortest path in length instead of hops
add weight = length of the fiber on connections where from_node is a fiber
add weight = 0.01 km on other edges (sketched below)
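A minimal sketch of that weighting over a networkx-style graph; `is_fiber` and
`length_km` are hypothetical callables standing in for the real element
introspection:

  def add_length_weights(network, is_fiber, length_km):
      """Weight each edge in km so shortest-path search minimizes distance, not hops."""
      for u, v in network.edges():
          network[u][v]['weight'] = length_km(u) if is_fiber(u) else 0.01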

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-18 11:16:07 +01:00
Jean-Luc Auge
0fee63fa81 manage empty edfa requirement list in auto_design
the code exits if no edfa is available and raman does not fulfil the min gain
requirement

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
d5f0d80eed clean up auto_design code
replace lambdas, filter and map functions with list comprehensions

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
b7d4d43f56 improve auto_design selection of raman
-separate edfa and raman_list of acceptable amplifiers to better
customize criteria between edfa and raman amplifiers
for example the min gain requirement: no extended min gain range for raman
amplifiers because there is always an edfa solution, which is better for
small gains.

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
c0379a1981 verbose possibility for path_request_run
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
e5db8e42d1 Display warning when Raman is used above max fiber lineic loss
- in auto-design, Raman is already not used if lineic fiber loss >
eqpt_config.json [Span][max_fiber_lineic_loss_for_raman]
- this commit ensures that a warning is displayed even when auto-design
is not used (when the network equipment configuration is imported from
json or xls topology)

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
27cf9806f0 bug fix in mixed auto/imported design with output voa
non-zero voa values were transferred to the following node

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
5dbb5cd112 bug fix with FUSED spans and EOL
EOL span ageing was added for each connector "to be fused" span
=> now, EOL ageing is only added to the last span of the fused spans
section.

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
Jean-Luc Auge
af75569eb8 clarify code on connector and EOL default loss
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-18 12:02:57 +02:00
James
aebf2ff270 Merge pull request #216 from Orange-OpenSource/automatic_design_test
Automatic design test
2019-04-16 12:21:29 -04:00
EstherLerouzic
3bcdeda3e9 Small fix
Coordinates in the files changed compared to the previous autodesign
version: the expected autodesign test files were updated accordingly

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:55:19 +01:00
EstherLerouzic
7433667243 Update of README.rst wrt Roadm equalization feature and addresses issue #210
TODO update README with dual stage Edfa type_def
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:13:40 +01:00
EstherLerouzic
d7c009167f add units on the stdout answer given by transmission_main_example.py (issue 201)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:13:40 +01:00
EstherLerouzic
79f198d6fe reduce the number of test files
- use the same file to test different configurations: with and without an Eqpt sheet defined,
   add the abcdfgh topo for disjunction tests in the same file
 - change the name of files to avoid mixing with examples/meshTopologyExampleV2.xxx

TODO: test if examples/ files pass for both path_requests_run and transmission_main_example

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:13:40 +01:00
EstherLerouzic
315a12b9df Update parser test files wrt the novel roadm_equalization branch
- update eqpt_config.json to power-mode : false to take into account gain target
    of the eqpt sheet
  - correct eqpt-config.json with new parameters
  - add gain target in meshTopolgyExampleV2Eqpt.xls
  - correct test_parser.py test function names
  - suppress Exceltestfile
  - change power_mode in test_parser to True in eqpt_config.json for all tests
    except for parser: changed in the function to False to enable
    Eqpt files gain target reading

Next: reduce the number of test files and include examples/ files in the tests

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:13:40 +01:00
EstherLerouzic
e14d145f2c Adding a test on autodesign self consistency
tests that the autodesign applied on an autodesigned file gives exactly the same result

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:13:29 +01:00
EstherLerouzic
f839da39f0 adding a test on autodesign verification
- tests if network autodesign does not change some reference files
- changed the meshTopologyExampleV2 files to account for newest features:
  empty columns in mode, power, nb channels, disjunction examples ...
- added a tsp type with more modes for diversification of tests

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-04-09 17:13:29 +01:00
James
45ca7a63ed Merge pull request #222 from Orange-OpenSource/edfa_model_doc
Edfa model doc
2019-04-09 11:01:42 -04:00
James
5d187255ae Merge pull request #215 from Orange-OpenSource/roadm_equalization
Roadm equalization
2019-04-09 11:00:48 -04:00
Jean-Luc Auge
7adf6aed59 doc file description of all amplifier models!!
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-08 15:03:30 +02:00
Jean-Luc Auge
b22a7a0234 refactor the amplifier build script build_OA_json.py
update the advanced json model generator

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-05 16:06:08 +02:00
Jean-Luc Auge
8805723114 clean edfa_model repo
remove unused obsolete files

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-05 10:59:03 +02:00
Jean-Luc Auge
88db4358f5 import Juniper amp model example
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-04 17:25:53 +02:00
Jean-Luc Auge
a5d9685caf modify std_medium_gain_advanced_config.json
reflect interpolation code changes

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-04 17:17:47 +02:00
Jean-Luc Auge
94ff8e6beb Advanced amplifier model improvement
-add f_min & f_max frequency definition in amplifier json
-improve interpolation algorithm to support length differences between the spectrum information and the amplifier ripple and dgt frequency definition

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-04 17:14:25 +02:00
Jean-Luc Auge
f3400d9bc1 update eqpt_config.json
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:51:12 +02:00
Jean-Luc Auge
6a6591e41d Add a warning message when attributes are missing in eqpt_config.json
new code provides default values when an attribute is missing in the
eqpt_config.json, which can cause unexpected behaviour
=> provide a warning message when a missing attribute is set to a
default value.
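Schematically (the helper and logger names are illustrative):

  from logging import getLogger

  logger = getLogger(__name__)

  def get_with_default(config, key, default):
      if key not in config:
          logger.warning('missing %r in eqpt_config.json, using default %r', key, default)
      return config.get(key, default)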

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:51:12 +02:00
Jean-Luc Auge
a3a53f3b06 info message in gain mode
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:51:12 +02:00
Jean-Luc Auge
2f39abfdb8 power reduction warning message in auto-design
warning message when extended gain range or power requirements cannot be met
and power is reduced

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:09:45 +02:00
Jean-Luc Auge
92239d66fc pass tests
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:09:45 +02:00
Jean-Luc Auge
178813806f update eqpt_config and example files
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:09:45 +02:00
Jean-Luc Auge
265dbffc53 Roadm channels power equalization
set individual channel powers wrt the target reference power;
tilt and ripple are eliminated after the Roadm

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
a67a08a4d0 Power reference tracking to roadm target power
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
6d15f55304 bug fix : Roadms & Spans vs Roadm & Span
discrepancy between topology and eqpt_config json descriptions

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
0dbcd1f265 manage operational vs calculated values
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
a0f6380f90 improve power mode and gain mode
power mode no longer reads operational_gain:
	now reads operational.dp_db
	or calculate optimum dp_db from next span loss
gain_mode:
	reads operational.gain_target
	or use gain_from_dp

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

squash to dp

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
ab69cf5bf4 improve Edfa class element
manage default values

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

squash to improve edfa element

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
3e55a5526a improve power saturation handling
fix issue with low power levels (after very long spans) where signal gain
was depleted

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
b6216bb701 operational amplifier delta P target power support
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
fa3ea3aaa7 convert headers restriction management
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
7b56e3a6c3 bug fix in dual stage NF calculation
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
00ad542a11 auto-design: relax min_gain restriction
tolerate 3dB below min gain in amplifier selection
this is to allow selection of high power amps even if they are below min
gain

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
c3e00eea2c manage inter-stage padding for dual stage typedef
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
5c8dd911e3 reduce power when the extended_gain_range limit is reached
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
407fd62da5 improve auto_design algorithm for amplifier selection
remove gain_max limit and use gain+power limit selection instead

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
c92f7ca0d8 parametrization of amplifier extended_gain_range
used for amplifier selection in auto-design
but it is possible to manually use and set an amplifier beyond its
extended_gain_range

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
676901e113 remove hybrid type_def implementation
replaced by dual_stage type_def

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
c099b53a03 parametrization of the max_fiber_lineic_loss_for_raman
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
fd065e4e7c raman tag in eqpt description
manage raman restrictions

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
fd97527561 raman restrictions
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
7b9647a063 dual stage
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
bf943f1347 insert a raman hybrid example in eqpt_config.json
auto_design over meshTopologyToy.xls will show raman use
run:
python transmission_main.py meshTopologyToy.xls

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:08:34 +02:00
Jean-Luc Auge
dbff610d77 Improve auto-design amplifier selection algorithm
support min gain for raman auto-design:
raman amplification is only considered for a span if its gain > specified
min gain of the raman hybrid in eqpt_config

auto-design was checked successfully when enabling a mix of 6 possible different
amplifiers: low gain, medium gain, hybrid ramans, high power and high
gain amplifiers. They were picked as expected.

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:05:45 +02:00
Jean-Luc Auge
46aae9486e update raman model to work in auto-design
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:05:45 +02:00
Jean-Luc Auge
70066de390 remove namedtuple class structure for network elements
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:05:45 +02:00
Jean-Luc Auge
106af9d444 hybrid raman/edfa model
quick & dirty 1st implementation:
-fixed raman nf and gain,
-no tilt, no ripple,
-no nonlinear contribution,
-no pump power reduction from egress con loss

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-04-03 14:05:45 +02:00
James Powell
f0be267f9f Merge branch 'develop' 2019-03-05 19:31:28 -05:00
James Powell
edbaec7265 small fixes 2019-03-05 19:11:17 -05:00
James Powell
f4537a538b interpolate fiber positions 2019-03-05 01:07:05 -05:00
James Powell
d6eb6f33d2 fixed ordering, display of SI details 2019-03-04 17:05:56 -05:00
James Powell
7a139c261a removed redundant files 2019-03-04 17:01:34 -05:00
James Powell
2e7aa213ed add units to OSNR 2019-03-04 17:00:41 -05:00
James Powell
d9344287e4 tweaks for OFC 2019-03-04 13:58:09 -05:00
James Powell
b9768a81e9 update README for v1.2 2019-03-04 12:50:35 -05:00
James Powell
3418c07512 changes for OFC demo 2019-03-04 12:35:12 -05:00
James Powell
5f8621c224 changes for OFC demo 2019-03-04 12:32:26 -05:00
James
c05f3555a3 Merge pull request #213 from Orange-OpenSource/features-description
more precise description on README for v1.1
2019-03-04 11:18:08 -05:00
EstherLerouzic
f21827395b update of the description of the json structure
- adding the different types of amplifiers

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-02-27 10:26:15 +00:00
EstherLerouzic
894c7bb17a update Excel_userguide.rst wrt convert-service_sheet issue #214
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-02-26 14:33:39 +00:00
Esther LE ROUZIC
2028cfae4d more precise description on README for v1.1
Signed-off-by: Esther LE ROUZIC <esther.lerouzic@orange.com>
2019-02-04 09:59:58 +01:00
James Powell
2064f65c04 Merge branch 'develop' 2019-01-30 16:00:38 -05:00
James Powell
1de4a4eaa2 README table fix 2019-01-30 15:59:27 -05:00
James
06ff45d0c2 Merge pull request #208 from Telecominfraproject/develop
Merge Develop to Master for v1.1 Release
2019-01-30 15:52:50 -05:00
James
81474e252e Merge branch 'master' into develop 2019-01-30 15:47:50 -05:00
James
0069265905 Merge pull request #207 from Telecominfraproject/dutc-patch-1
Update README.rst
2019-01-30 15:45:36 -05:00
James
dec15f6797 Update README.rst 2019-01-30 15:40:38 -05:00
Gert Grammel
f6041cd844 Update Contributors
Adding Contributors to the list
2019-01-30 19:35:42 +00:00
James
ec20d3981b Merge pull request #200 from ojnas/fix-readme-coronet
Correct README
2019-01-28 10:39:38 -05:00
James
08a867ef5a Merge pull request #199 from Orange-OpenSource/sys_margins
Sys margins
2019-01-28 10:39:19 -05:00
Jean-Luc Auge
76c8296a5d remove osnr_nli update when adding add_drop_osnr or tx_osnr
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-23 14:52:02 +01:00
jonas
f65059dd7f correct README 2019-01-23 14:16:43 +01:00
Jean-Luc Auge
e23fef3f64 update and pass tests
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-23 10:31:42 +01:00
Jean-Luc Auge
30234f913c system margins implementation
-read sys_margins in equipment['SI']
-add sys_margins to all Transceivers OSNR for path feasibility check
(path_request_run only)

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-23 10:31:42 +01:00
Jean-Luc Auge
2dd017bddc code speed improvement (-30% computation time)
remove path propagation for each mode;
path propagation is only done once per baud rate (see the sketch below)
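The idea, sketched with plain dicts and a placeholder `propagate` callable
rather than the project's real signatures:

  from itertools import groupby
  from operator import itemgetter

  def snr_per_mode(modes, propagate):
      """Run the expensive propagation once per distinct baud rate and reuse the
      result for every mode sharing that baud rate."""
      results = {}
      by_baud_rate = itemgetter('baud_rate')
      for baud_rate, group in groupby(sorted(modes, key=by_baud_rate), key=by_baud_rate):
          snr = propagate(baud_rate)  # the costly step, done once per baud rate
          for mode in group:
              results[mode['format']] = snr
      return results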

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-23 10:31:42 +01:00
Jean-Luc Auge
b25a298087 NEW add_drop OSNR in Roadms model
add_drop_osnr contribution is added at the end of the transmission
propagation, at the same time as Tx_osnr

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-23 10:31:31 +01:00
Jean-Luc Auge
8635a7c182 Code fix: non cumulative osnr penalties
add Tx_osnr or add_drop_osnr on raw snr values to avoid cumulative adding
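For context, independent OSNR contributions combine on the inverse linear
scale, which is why each one must be added to the raw values exactly once; a
small self-contained sketch:

  from math import log10

  def db2lin(value_db):
      return 10 ** (value_db / 10)

  def lin2db(value_lin):
      return 10 * log10(value_lin)

  def combine_osnr(*contributions_db):
      """1/OSNR_total = sum(1/OSNR_i), everything handled in linear units."""
      return lin2db(1 / sum(1 / db2lin(c) for c in contributions_db))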

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

bug fix: forgot to add add_drop_osnr with tx_osnr

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-23 10:29:30 +01:00
Jean-Luc Auge
46d25df241 add Tx_osnr at the end of the transmission
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-22 18:05:19 +01:00
Jean-Luc Auge
0338ccb08f restore SI basic parameters f_min f_max & spacing
remove unnecessary SI parameters in eqpt config.json:
-rx osnr
-bit rate
-min spacing
-cost
remove nb_channel parameter when calling SI

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-22 18:05:19 +01:00
James
8899a575b8 Merge pull request #144 from ojnas/fix-crash-source-dest
Change source and destination logic in transmission_main_example.py
2019-01-22 10:12:58 -05:00
James
b4407b1ff3 Merge branch 'develop' into fix-crash-source-dest 2019-01-22 10:12:18 -05:00
James
75f0aebe8f Merge pull request #193 from Orange-OpenSource/carrier_probe
Carrier probe
2019-01-22 10:09:47 -05:00
James
5a2dd53636 Merge pull request #195 from Orange-OpenSource/name_matching
Name matching dictionary
2019-01-22 10:09:16 -05:00
James
3555154c3e Merge pull request #194 from Orange-OpenSource/xls_parser_enhance
Xls parser enhance
2019-01-22 10:08:51 -05:00
James
6c92c282e7 Merge pull request #192 from Orange-OpenSource/roadms_openroadm
openroadm noise models and ROADM improvement
2019-01-22 10:07:21 -05:00
James
bd1847e5ba Merge branch 'develop' into roadms_openroadm 2019-01-22 10:04:45 -05:00
James
38727d6203 Merge pull request #188 from Orange-OpenSource/tsp_min_spacing
Tsp min spacing
2019-01-22 10:04:03 -05:00
James
7e6d557d01 Merge pull request #187 from Orange-OpenSource/fourth_correction
Fourth correction
2019-01-22 10:03:25 -05:00
James
5d3ce91839 Merge branch 'develop' into fourth_correction 2019-01-22 10:00:54 -05:00
James
1f34e3005e Merge pull request #186 from Orange-OpenSource/tx_osnr_bug_correction
bug fix due to tx_osnr add
2019-01-22 09:59:57 -05:00
James
8e27437086 Merge branch 'develop' into tx_osnr_bug_correction 2019-01-22 09:56:19 -05:00
James
b4cbe8029e Merge pull request #191 from davidBoertjes/davidBoertjes-node-coord-fix
fix for issue #190
2019-01-15 11:33:35 -05:00
James
92f3fd2063 Update node.py 2019-01-15 11:33:17 -05:00
James
1112b331ef Merge pull request #183 from Orange-OpenSource/second_corrections
Fixes concerning service sheet reading
2019-01-15 11:32:32 -05:00
James
ec34e84a3a Merge pull request #182 from Orange-OpenSource/corrections
Corrections
2019-01-15 11:31:48 -05:00
Jean-Luc Auge
bc9eee326a parametrization of the name matching dictionary
-names argument

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 17:17:58 +01:00
Jean-Luc Auge
c9106c3a6f enhance name matching dictionary
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 17:17:45 +01:00
Jean-Luc Auge
fa949f977a add name find closest match in xls parser convert.py
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

output node name gestalt matching to a json

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 17:17:33 +01:00
Jean-Luc Auge
4c2d61bb9b update test files
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 17:14:17 +01:00
Jean-Luc Auge
ec7b14da8c convert.py xls to json parser class enhancement
improve default values handling
update xls examples with east/west headers

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 17:11:07 +01:00
Jean-Luc Auge
771af4991c carrier probe point for Edfa and Fiber elements
save and be able to retrieve the carrier information processed by Edfa
and Fiber elements:
@ input & output of the element
- channel power: ase, nli, signal and total

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 12:20:45 +01:00
Jean-Luc Auge
a08ce9ecb7 gain mode fix
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 12:16:16 +01:00
Jean-Luc Auge
4d84a4f528 replace egress/ingress by east/west denomination
xls parser

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 12:16:16 +01:00
Jean-Luc Auge
5d92baf35e Add link duplication check for the xls parser
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

Check and remove duplicate links

a warning is issued when a duplicate link is discovered
the execution is paused for the user to see the warning
the duplicate link is removed and the execution resumed

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 12:16:16 +01:00
Jean-Luc Auge
ac5171e95e Improve Excel Node parser
-read headers in any order
-possible hierarchical implementation

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 12:16:16 +01:00
Jean-Luc Auge
697ac311fe Improve excel link sheet parser
-implement hierarchical headers
-can read headers in any order

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-14 12:16:16 +01:00
Jean-Luc Auge
c22d1173af update test eqpt_config
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-11 12:28:39 +01:00
Jean-Luc Auge
c0cc5fa9fd Improve ROADM loss calculation in power mode
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-11 12:28:39 +01:00
Jean-Luc Auge
4bd9a9cdda final openroadm models implementation
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2019-01-11 12:26:48 +01:00
EstherLerouzic
63f8139dbc Adding min_spacing parameter in transponder mode library
with this feature, spacing is checked with respect to eqpt library
instead of hard coded values in equipment +
mode selection is based on the minspacing instead of a coef or margin on baudrate value
some sanity checks are introduced (sketched below):
 - request spacing must be greater than min spacing
 - tsp baudrate must be smaller than min spacing
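A rough sketch of those two checks (parameter names are illustrative):

  def check_spacing(request_spacing, min_spacing, baud_rate):
      if request_spacing < min_spacing:
          raise ValueError(f'requested spacing {request_spacing} Hz is below the mode '
                           f'minimum spacing {min_spacing} Hz')
      if baud_rate > min_spacing:
          raise ValueError(f'baud rate {baud_rate} Bd does not fit into the minimum '
                           f'spacing {min_spacing} Hz')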

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 11:11:56 +00:00
EstherLerouzic
be731a5977 cosmetic changes
add colors on messages to highlight them
add messages explaining program steps
update readme and excel user guide with latest features

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 11:09:19 +00:00
EstherLerouzic
dd4ce4cea4 small fixes on stdout
- indicate when a mode is selected if the selected path does not pass
  + some reformatting of columns

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 11:09:19 +00:00
EstherLerouzic
2548a2eee8 small fix on test eqpt_config file
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 11:09:19 +00:00
EstherLerouzic
03948d6785 Fixes concerning service sheet reading
- indicate which request has incorrect tsp type or mode
- correct wrong reading from excel: if string values contain only a number,
  xlrd interprets it as a float, e.g. 1 -> 1.0. A function is used to correct this
  when filling the Request class (see the sketch below)
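A sketch of such a correction function (not the actual helper in the code base):

  def cell_to_str(value):
      """Undo xlrd returning purely numeric cells as floats, e.g. '1' read back as 1.0."""
      if isinstance(value, float) and value.is_integer():
          return str(int(value))
      return str(value)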

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 10:41:16 +00:00
EstherLerouzic
4c1c17eea6 adding some exception handling to help user debugging
exception handling adds some info on which element (uid) is causing a problem;
this can help the user identify where to correct the input files

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 10:41:16 +00:00
EstherLerouzic
b258d22d25 add a limit cutoff for path search
the all_simple_paths search can take a very long time on a large network;
a cutoff parameter has been added to avoid this. The value needs to be parametrized:
right now it corresponds to my experience
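Schematically, using networkx and a placeholder cutoff value:

  from networkx import all_simple_paths

  def candidate_paths(network, source, destination, cutoff=90):
      """Enumerate simple paths, bounded in depth so large meshes stay tractable."""
      return all_simple_paths(network, source, destination, cutoff=cutoff)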

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-01-11 10:41:16 +00:00
EstherLerouzic
aef43e6bca Corrections of some bugs
- output file of -o option in path_requests_run.py was not correctly
  handled in case of path indirection eg ../../bar.foo
- fiber constraint is not correctly handled in case of several parallel directions
  of the same fiber name. remove it from the set of constraints until the problem is solved
- loose_list was not correctly indexed in request.py. assume the first element attribute (except
  for the transceiver) applies for the whole list. This will need further corrections
- handle the case when path.snr is None in the optimization of mode process
2019-01-11 10:41:11 +00:00
David Boertjes
3d7362743d Update node.py
Fix return value error in *coords* property which was intended to build a tuple from two values for latitude and longitude.
2019-01-10 13:15:59 -05:00
EstherLerouzic
96f3d5a805 Adding min_spacing parameter in transponder mode library
with this feature, spacing is checked with respect to eqpt library
instead of hard coded values in equipment +
mode selection is based on the minspacing instead of a coef or margin on baudrate value
some sanity checks are introduced:
 - request spacing must be greater than min spacing
 - tsp baudrate must be smaller than min spacing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-21 16:57:08 +00:00
EstherLerouzic
2df500e027 cosmetic changes
add colors on messages to highlight them
add messages explaining program steps
update readme and excel user guide with latest features

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-19 18:24:22 +00:00
EstherLerouzic
346f24022a small fixes on stdout
- indicate when a mode is selected if the selected path does not pass
  + some reformatting of columns

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-19 18:21:51 +00:00
EstherLerouzic
cb45c7ef16 new test to check all combination of service specifications
add combination of [mode, pow, nb channel] filled or empty
leading to feasible path or not, and check the propagate and propagate_and_optimize_mode
functions work as expected on a reference toy example

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-19 18:17:29 +00:00
EstherLerouzic
9cfb57dc4b bug fix due to tx_osnr add
tx_osnr was impacting the automatic mode selection feature: this commit solves the error
+ add a novel version of the excel example with empty mode cases

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-19 16:32:54 +00:00
EstherLerouzic
020d852758 adding some exception handling to help user debugging
exception handling adds some info on which element (uid) is causing a problem;
this can help the user identify where to correct the input files

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-19 10:49:31 +00:00
James
8d97fcd735 Merge pull request #177 from ojnas/add-tx-osnr
Add Tx OSNR
2018-12-18 11:57:35 -05:00
James
097fe3114e Merge pull request #156 from ojnas/power-range
improve power range handling - revised version
2018-12-18 11:56:47 -05:00
James
5e0fd265ff Merge pull request #180 from ojnas/fix-osnr-calculation
Fix OSNR calculation when ASE or NLI is zero
2018-12-18 11:55:43 -05:00
EstherLerouzic
6af137a085 Fixes concerning service sheet reading
- indicate which request has incorrect tsp type or mode
- correct wrong reading from excel: if string values contain only a number,
  xlrd interprets it as a float, e.g. 1 -> 1.0. A function is used to correct this
  when filling the Request class

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-17 09:27:05 +00:00
EstherLerouzic
3cdc8511a8 add a limit cutoff for path search
the all_simple_paths search can take a very long time on a large network;
a cutoff parameter has been added to avoid this. The value needs to be parametrized:
right now it corresponds to my experience

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-12-13 11:37:26 +00:00
EstherLerouzic
2d515eea4c Corrections of some bugs
- output file of -o option in path_requests_run.py was not correctly
  handled in case of path indirection eg ../../bar.foo
- fiber constraint is not correctly handled in case of several parallel directions
  of the same fiber name. remove it from the set of constraints until the problem is solved
- loose_list was not correctly indexed in request.py. assume the first element attribute (except
  for the transceiver) applies for the whole list. This will need further corrections
- handle the case when path.snr is None in the optimization of mode process
2018-12-13 11:28:42 +00:00
Jonas Mårtensson
61289119cb fix OSNR calculation when ASE or NLI is zero 2018-12-13 11:07:16 +01:00
Jonas Mårtensson
b6bc995e40 fix OSNR test 2018-12-11 21:29:17 +01:00
Jonas Mårtensson
b0bb41bac6 fix eqpt_config test file 2018-12-11 20:48:53 +01:00
Jonas Mårtensson
0927a92652 fix amplifier test 2018-12-11 17:25:26 +01:00
Jonas Mårtensson
f51061d650 Add support for Tx OSNR. 2018-12-11 16:59:05 +01:00
James
74314f00ca Merge pull request #175 from MiquelGA/develop
Few minor details and typos in README
2018-12-11 10:07:09 -05:00
miquelgarr
81c5ef4a23 Adding "transmission_main_example.py -h" 2018-11-30 12:16:22 +01:00
miquelgarr
72e329b08e Now applying corrections to the README 2018-11-30 10:28:41 +01:00
miquelgarr
50f884663f Minor typo changes in README 2018-11-30 10:25:17 +01:00
miquelgarr
0d81eb4b29 Minor typos solved in README 2018-11-30 09:59:51 +01:00
James
978a9407fa Merge pull request #171 from szhu3210/patch-1
Fix a typo of log message in transmission_main_example.py
2018-11-27 12:22:01 -05:00
James
2c3b74cdc1 Merge pull request #168 from Orange-OpenSource/service_aggregation_clean_rebased
Service aggregation clean rebased
2018-11-27 11:31:41 -05:00
Shengxiang Zhu (Troy)
448e0f54be Update transmission_main_example.py
Seems a typo.
2018-11-23 17:32:58 -07:00
EstherLerouzic
17f638e991 New mesh example with correct inputs + small fix
small fix: correct wrong numbering of requests in json and csv
           output when -o is used; correct example files

signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:56:00 +00:00
EstherLerouzic
b78d3d8eda small fixes
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:20:25 +00:00
EstherLerouzic
02a7e467e2 Major correction on mode optimization behaviour + small fixes
- this version handles special cases: if no baudrate satisfies the spacing,
  if no mode satisfies the OSNR threshold from transponders
- template of service corrected with the novel path_bandwidth
- some ideas TODO added for testing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:20:25 +00:00
EstherLerouzic
15304890f5 Corrections related to tests and cost feature
- adding cost in every config files including tests files,
- correcting wrong reactions to incorrect usage (if a value is not correct)
- correction of spelling

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:20:19 +00:00
EstherLerouzic
9ea96e431c Test disjunction file corrected to include novel path_bandwidth parameter
Correction of data file for test_parser.py + adding a test on path non looping

- service json files now include the path_bandwidth value
- previous constrained routing did not ensure a loop-free path. This has been
  corrected, and a test has been added to avoid losing this feature in future
  versions.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:16:23 +00:00
EstherLerouzic
7b3bfea614 correction on aggregation to support disjunction feature
- correction of json to csv function
- small correction on disjunction format to avoid duplicates (duplicates crash the program)
- adding time stamp to measure perf
- demands with all the same disjunction can be aggregated
- a new set of disjunction lists is created to avoid inconsistency
  with the aggregated request-ids

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:16:17 +00:00
Jean-Luc Auge
d68637c2c8 clean eqpt_config.json, add low noise openroadm amp
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-11-21 09:14:17 +00:00
Jean-Luc Auge
f009306030 small fix
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-11-21 09:12:58 +00:00
Jean-Luc Auge
ca97cba18b display channel count information when running transmission_main
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-11-21 09:12:58 +00:00
Jean-Luc Auge
a46c8c5398 OpenRoadm amplifier model!
nf = f(Pin)

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-11-21 09:12:58 +00:00
EstherLerouzic
88c2e2bd70 adding the example file used for disjunction testing
adding the counting feature on standard output
adding tsp nb in csv + small bug fixes
small fixes to improve stdout and csv printing
small bug fixes on service aggregation and automatic correction of names

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:12:52 +00:00
EstherLerouzic
1bbcee8715 First version of mode optimization
if mode is not given, the program automatically chooses the first mode of the tsp
that has the highest bitrate and baudrate
- propagation for each loop on baudrate, no propagation in the loop on the mode
  (same propagation applies for identical baudrate)
- starts with the highest baudrate
- Make power and nb-channel optional fields
  - power uses the default power in SI
  - nb channel computed based on min max frequency and spacing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:06:38 +00:00
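
A sketch of the mode-selection rule described above (pick the mode with the highest baudrate, then the highest bitrate, when no mode is given); the mode list and field names below are invented for illustration:

modes = [
    {'format': 'mode 1', 'baud_rate': 32e9, 'bit_rate': 100e9},
    {'format': 'mode 2', 'baud_rate': 64e9, 'bit_rate': 200e9},
    {'format': 'mode 3', 'baud_rate': 64e9, 'bit_rate': 300e9},
]

# highest baudrate first, then highest bitrate among equal baudrates
best = max(modes, key=lambda m: (m['baud_rate'], m['bit_rate']))
print(best['format'])   # -> mode 3
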
EstherLerouzic
39a8fa3335 Adding aggregation feature
- add bitrate to the json service file and support requests both with and without bitrate
- change mode to optional
- add an aggregation function that groups identical demands (except for id and bitrate)
  TODO: check which mode can satisfy the request or how many transponders -> going towards
  dimensioning
- correct a bug on the loose parameter: the hop-type attribute was forced to always be 'loose'
  -> now reads the excel entry
- the service sheet check verifies header field names, but the columns may now
  appear in any order.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-21 09:06:32 +00:00
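
A minimal sketch of the aggregation idea above (group demands that are identical except for id and bitrate, then sum the bitrates); the request dictionaries and field names are illustrative, not gnpy's actual data model:

from collections import defaultdict

requests = [
    {'id': '1', 'source': 'A', 'destination': 'B', 'trx_type': 'tsp1', 'bit_rate': 100e9},
    {'id': '2', 'source': 'A', 'destination': 'B', 'trx_type': 'tsp1', 'bit_rate': 200e9},
    {'id': '3', 'source': 'A', 'destination': 'C', 'trx_type': 'tsp1', 'bit_rate': 100e9},
]

groups = defaultdict(list)
for r in requests:
    # everything except id and bit_rate defines the group
    key = tuple(sorted((k, v) for k, v in r.items() if k not in ('id', 'bit_rate')))
    groups[key].append(r)

aggregated = []
for members in groups.values():
    agg = dict(members[0])
    agg['id'] = '|'.join(m['id'] for m in members)
    agg['bit_rate'] = sum(m['bit_rate'] for m in members)
    aggregated.append(agg)

for a in aggregated:
    print(a['id'], a['source'], a['destination'], a['bit_rate'])
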
James
fe067e5367 Merge pull request #167 from ojnas/verbosity
handling of --verbose
2018-11-20 09:57:26 -05:00
Jonas Mårtensson
5efbd17829 add help text 2018-11-20 15:27:54 +01:00
Jonas Mårtensson
fa3e54a747 change handling of verbosity argument 2018-11-20 14:57:28 +01:00
James
603beccb01 Merge pull request #151 from ojnas/gain-mode-fix
gain-mode fix
2018-11-19 18:18:35 -05:00
James
4b20afd599 Merge pull request #165 from ojnas/fix-broken-links
fix broken links in readme
2018-11-19 18:16:46 -05:00
Jonas Mårtensson
adbe283c83 fix broken links in readme 2018-11-19 13:03:32 +01:00
James
1908d7e29a Merge pull request #161 from Orange-OpenSource/correcting_example
correcting example file
2018-11-16 11:08:38 -05:00
James
7c6e16cfbc Merge pull request #164 from ojnas/fix-encoding
fix utf-8 encoding
2018-11-16 11:07:27 -05:00
Jonas Mårtensson
1480d23088 fix utf-8 encoding 2018-11-16 13:38:07 +01:00
EstherLerouzic
4be3522209 change for meshTopologyToy.xls
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-16 11:14:37 +00:00
EstherLerouzic
5381e0300f correcting example file
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-11-16 11:04:52 +00:00
James
cde822ebf8 Merge pull request #157 from ojnas/missing-imports
add a couple of missing imports
2018-11-15 12:39:31 -05:00
Jonas Mårtensson
72d3525da1 add a couple of missing imports 2018-11-14 15:24:25 +01:00
Jonas Mårtensson
dc867fa051 improve power range handling 2018-11-14 09:08:14 +01:00
James
eaf3fcade8 Merge pull request #153 from dutc/develop
fix changed spelling & misspellings in meshTopolgyExampleV2_Services.json for "synchronize/synchronization"
2018-11-13 11:49:01 -05:00
James Powell
3df270e4ac fix changed spelling & misspellings in meshTopolgyExampleV2_Services.json for "synchronize/synchronization" 2018-11-13 10:24:15 -05:00
Jonas Mårtensson
7937392dfc fix bug in gain mode 2018-11-13 13:33:21 +01:00
James
9c1c0f8d1f Merge pull request #137 from Orange-OpenSource/path_disjunction
Path disjunction
2018-11-12 10:11:08 -05:00
James
ad2ab0d164 Merge pull request #143 from ojnas/coronet-all-nodes-roadm
Explicitly set all nodes in CORONET example to ROADM
2018-11-12 10:10:45 -05:00
James
c4bed94eb0 Merge branch 'develop' into coronet-all-nodes-roadm 2018-11-12 10:10:23 -05:00
James
edc8eb55de Merge pull request #145 from ojnas/split-fiber
Some modifications in split_fiber function
2018-11-12 10:09:15 -05:00
Jonas Mårtensson
f4f9868381 Change span numbering in split_fiber. 2018-11-05 17:43:14 +01:00
Jonas Mårtensson
f103bebe05 Some modifications in split_fiber function. 2018-11-05 17:13:03 +01:00
Jonas Mårtensson
c168af46bc Update transmission_main_example.py 2018-10-30 17:37:50 +01:00
EstherLerouzic
7d82248903 Small fix
obsolete text left in the message caused an error: it has been removed

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-30 16:36:44 +00:00
James Powell
e6cb269754 Merge branch 'master' into develop 2018-10-30 12:05:44 -04:00
James
ac8a96398a Update issue templates 2018-10-30 12:03:36 -04:00
EstherLerouzic
86c79c7c60 Merge branch 'develop' into path_disjunction solving conflict
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-30 16:01:06 +00:00
James Powell
0c47b3f3ea small fix in case spacing_list is not sorted 2018-10-30 12:00:47 -04:00
James
b2b500c5dc Merge pull request #141 from ojnas/automatic-spacing
Simplify automatic spacing
2018-10-30 11:59:20 -04:00
ojnas
efc8468268 Set type of all nodes to ROADM in CORONET Global. 2018-10-30 09:48:59 +01:00
ojnas
205baebd48 Merge remote-tracking branch 'upstream/develop' into develop 2018-10-30 09:37:27 +01:00
James
af9ba2750d Update examples/path_requests_run.py
Co-Authored-By: EstherLerouzic <EstherLerouzic@users.noreply.github.com>
2018-10-30 09:27:52 +01:00
James
e04afdbe4c Update examples/path_requests_run.py
Co-Authored-By: EstherLerouzic <EstherLerouzic@users.noreply.github.com>
2018-10-30 09:27:12 +01:00
James
e94fd9590e Merge pull request #142 from dutc/develop
Fix encoding issues
2018-10-29 20:02:28 -04:00
James Powell
4f4f05abdf allow JSON to encode UTF-8 2018-10-29 11:17:13 -04:00
James Powell
bcf93e1d9f enforce utf-8 encoding for reading/writing JSON 2018-10-29 11:14:31 -04:00
ojnas
48198bdd89 Simplify automatic spacing. 2018-10-28 08:11:22 +01:00
EstherLerouzic
fbb4f3e5dd small fixes to comply with test files
- re-changing 'to' to \u2192 in convert.py to comply with test files
- changing the route constraint in the data file: its naming in the service file
  was too ambiguous
- the test on None was incorrect in convert_sheet

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-26 10:43:27 +01:00
James
cefd1cf030 Merge pull request #136 from ojnas/main-source-dest
source and destination handling in transmission main example
2018-10-25 10:47:00 -04:00
James
f8fa544e31 Merge pull request #135 from ojnas/add-fiber-types
Update eqpt_config.json
2018-10-25 10:46:02 -04:00
EstherLerouzic
27885a4cbc Correction of route constraint to support any type of node
correct the loose or strict constraint in the new function correct_route_list
correct the test to find the name among existing uids
remove the \u2192 char that was not well supported when reading excel

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-25 15:42:09 +01:00
EstherLerouzic
185a62958f adding safe check on baudrate and spacing
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-25 15:42:02 +01:00
EstherLerouzic
1ba748f2a4 Changing optical power and nb channels as optional input for requests
- the normal way is to apply the design optical power to all channels.
  This change uses the default power (the same power as used for design) but allows
  forcing an arbitrary power if needed. TODO: introduce spectral power density
  to apply power depending on baudrate.
- min/max frequency and spacing define the number of channels:
  min/max frequencies and spacing are used to determine nb-channels. It is possible
  to force a different spacing for the request. TODO: check that the value is
  consistent with the baudrate and the min/max values.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-25 15:36:53 +01:00
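
A small sketch of deriving the channel count from min/max frequency and spacing as described above; the C-band numbers below are only example values, kept in GHz so the arithmetic stays exact:

f_min = 191350    # GHz
f_max = 196100    # GHz
spacing = 50      # GHz

nb_channels = (f_max - f_min) // spacing + 1
print(nb_channels)   # 96 channels on a 50 GHz grid, under these example values
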
EstherLerouzic
90a75a9b3d Removing the duplicate convert_service_sheet.py in examples directory
service_sheet.py contains exactly the same function: I suggest removing this file
to clean up the examples directory

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-25 15:36:36 +01:00
ojnas
215295efb1 changed source and destination handling 2018-10-25 15:25:28 +02:00
Jonas Mårtensson
2413bd9e0d Update eqpt_config.json
Add fiber types.
2018-10-25 11:31:39 +02:00
James
356ae650fd Update issue templates 2018-10-23 17:09:56 -04:00
James
2444c24545 Update issue templates 2018-10-23 17:09:10 -04:00
James
7727708a3a Merge pull request #131 from ojnas/equipment-refactoring
Equipment refactoring
2018-10-23 14:15:39 -04:00
Jonas Mårtensson
0bfacd84f4 Update equipment.py 2018-10-23 17:16:51 +02:00
Jonas Mårtensson
b271c1ca3c Update equipment.py 2018-10-23 17:13:08 +02:00
EstherLerouzic
75660febc1 Create test_disjunction
- create some test to check that produced paths are really disjoint
- add some TODOs on optical power for requests computation

First check of disjunction feature added to tests

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-23 11:53:57 +01:00
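
An illustrative pytest-style check of the "paths are really disjoint" property that test_disjunction is about; paths are plain lists of node names here, whereas the real test works on gnpy path objects:

def are_disjoint(path_a, path_b):
    """Node-disjoint except for the shared source and destination."""
    return set(path_a[1:-1]).isdisjoint(path_b[1:-1])

def test_disjoint_paths():
    path_1 = ['a', 'b', 'c', 'g']
    path_2 = ['a', 'e', 'f', 'h', 'g']
    assert are_disjoint(path_1, path_2)
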
EstherLerouzic
13aaa174e1 Adding this toy example network for disjunction testing
1     1
     a----b-------c
     |    |       |
     |1   |0.5    |1
     |    |       |
     e----f---h---g
       1   0.5 0.5

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-23 09:30:20 +01:00
EstherLerouzic
d112c728fc Moving the disjunction function into core functions
and minor corrections

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-23 09:25:07 +01:00
EstherLerouzic
826af4a9fd Minor corrections of tests due to spelling change
- "synchronisation" replaced with synchronization
- "synchonzation" replaced with synchronization

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:41:56 +01:00
EstherLerouzic
c9693d355f some cleaning and renaming
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:26:58 +01:00
EstherLerouzic
9f37cb8ce6 Corrections, cleanup and debugging
Added (partly):
- relax one constraint when no more paths are available

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:25:52 +01:00
EstherLerouzic
d99e8ca565 routing constraint added on top of disjunction feature : step 4
select disjoint routes that satisfy route constraint

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:25:52 +01:00
EstherLerouzic
44312125ab Disjunction feature step 3
- select the first path that is satisfying disjunction
not finished:
- constraints on nodes not taken into account
- modification on the build network not added
- not clean, need some refactoring

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:25:49 +01:00
EstherLerouzic
7558721642 Disjunction feature step 2
- selection of disjoint path set for each synchronization vector

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:23:11 +01:00
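
A very simplified sketch of "select a disjoint path set for each synchronization vector": for every request in the vector, keep the first candidate path that is node-disjoint from the paths already chosen. The candidate paths are invented, and the real implementation (conflict table, route constraints) is more involved:

candidates = {
    'req1': [['a', 'b', 'c', 'g'], ['a', 'e', 'f', 'h', 'g']],
    'req2': [['a', 'b', 'f', 'g'], ['a', 'e', 'f', 'h', 'g']],
}

chosen = {}
used_inner_nodes = set()
for req, paths in candidates.items():
    for path in paths:
        inner = set(path[1:-1])          # ignore the shared source/destination
        if inner.isdisjoint(used_inner_nodes):
            chosen[req] = path
            used_inner_nodes |= inner
            break

print(chosen)
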
EstherLerouzic
fb49f7fb5d Implementing disjunction in path_requests_run.py
- Implementing a conflict table for path-disjoint choices
- adding disjunction in the parser and json (read and write)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-10-22 16:23:00 +01:00
James
ee7f2c2f47 Merge pull request #129 from ojnas/minor-readme-fixes
Update README.rst
2018-10-22 10:19:04 -04:00
Jonas Mårtensson
833fe006af Update README.rst 2018-10-22 14:19:52 +02:00
James Powell
9be0607b2e add authors 2018-10-17 09:25:14 -04:00
James Powell
6dc3f2ffa6 small updates 2018-10-16 05:58:42 -04:00
James Powell
df7cbf0b76 small README fix 2018-10-16 05:50:26 -04:00
James
19b6378b1a Merge pull request #126 from dutc/develop
add tagged version table
2018-10-16 05:46:12 -04:00
James
29cb2b50a8 Merge pull request #125 from ojnas/ojnas-minor-fixes-edfa
Update elements.py
2018-10-16 05:43:36 -04:00
James Powell
5932c014a0 add tagged version table 2018-10-16 05:42:31 -04:00
Jonas Mårtensson
a3f75e9af0 Update elements.py
Minor fixes to Edfa class
2018-10-16 11:05:34 +02:00
James
9ed31e2c4e Merge pull request #124 from dutc/develop
small fixes
2018-10-16 04:51:50 -04:00
James Powell
8599f63fbf remove redundant directory 2018-10-16 04:51:14 -04:00
James Powell
07de78cb05 small fixes 2018-10-16 04:50:33 -04:00
James Powell
f764bbe080 small changes 2018-10-16 04:48:22 -04:00
James Powell
7aa343b767 small fixes 2018-10-16 04:43:25 -04:00
James Powell
69198779a7 Merge branch 'develop' 2018-10-15 22:44:39 -04:00
James
3088032ad8 Merge pull request #123 from dutc/develop
Readme fixes
2018-10-15 22:37:22 -04:00
James
dfaa12598d Merge pull request #121 from dutc/master
Fixes
2018-10-14 20:59:05 -04:00
James Powell
1c724cdc6c Merge branch 'master' into develop 2018-10-14 20:58:42 -04:00
James
b59423fb01 Merge pull request #120 from dutc/develop
Develop
2018-10-14 20:58:02 -04:00
James
96bceed102 Merge pull request #119 from dutc/fixes
Fixes
2018-10-14 20:55:15 -04:00
James Powell
f8e146b9b9 duplicate service sheet to simplify structure 2018-10-14 20:55:25 -04:00
James Powell
c54ddb644a fix headers 2018-10-14 20:55:12 -04:00
James Powell
2be616411f fix headers 2018-10-14 20:55:00 -04:00
James Powell
7415744807 small formatting 2018-10-14 20:54:48 -04:00
James Powell
23905a90f4 strip whitespace 2018-10-14 20:54:35 -04:00
James Powell
83444b329e strip whitespace 2018-10-14 20:54:22 -04:00
James Powell
9898dc85a9 remove notebook and binder 2018-10-14 20:54:10 -04:00
James Powell
661287f600 perms 2018-10-14 20:53:56 -04:00
James Powell
353c6a77e9 make package 2018-10-14 20:53:41 -04:00
James Powell
8c006fec3f README fixes 2018-10-10 13:37:49 -04:00
James Powell
5896b7ce6a README fixes 2018-10-10 13:25:56 -04:00
James Powell
f831f6cb2c README fixes 2018-10-10 13:25:12 -04:00
James Powell
b505a1ae01 README fixes 2018-10-10 13:23:47 -04:00
James
a0fac17c0e Merge pull request #110 from dutc/master
v0.9 release
2018-09-24 12:56:49 -07:00
James Powell
851f606fc0 Merge branch 'develop' 2018-09-24 15:55:09 -04:00
James
8940430ee0 Merge pull request #108 from Orange-OpenSource/develop
auto_design saving to json and README update
2018-09-24 12:51:38 -07:00
James
06d3927275 Merge branch 'develop' into develop 2018-09-24 12:49:45 -07:00
James
7cea13e90d Merge pull request #104 from Orange-OpenSource/path_resquest_rundemo
Path request run demo
2018-09-24 12:47:12 -07:00
Jean-Luc Auge
de2a504078 README.rst update
Add auto_design and equipment library description!

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-09-18 17:41:54 +02:00
Jean-Luc Auge
df14b441a3 improve code clarity of auto_design power setting
>network>target_power

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-09-18 17:12:58 +02:00
Jean-Luc Auge
ab1391440f Fix bug in padding and EOL and Roadm power
take into account manual padding input, i.e. att_in defined in fiber params
add EOL even if con_out is not None
Force Roadm input power to 0 in power_mode even if the ingress amplifier
gain is not null

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-09-17 13:16:38 +02:00
Jean-Luc Auge
310d32dcea Generic write_csv in utils to save simulation results
transmission_main simulation results are saved to simulation_result.csv

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-09-13 14:30:33 +02:00
EstherLerouzic
f8cd822c92 update of test files with respect to the use of string id instead of integer id
update Excel_userguide.rst for the services uid: now accepts strings

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-09-12 11:24:55 +01:00
EstherLerouzic
3ade885e41 improving path_request_run robustness to wrong excel input
- adding a test to check that transponder types and modes are part of the eqpt library
- adding formatting of numerical value input for path request ids
- automatically printing the results into a csv file in addition to the json file,
  with the same name + .csv
- printing results in a prettier way for demo

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-09-11 11:09:28 +01:00
EstherLerouzic
008a88192c small update of the toy topology file + comment out a print in network.py
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-09-10 13:44:56 +01:00
EstherLerouzic
639e7f012c some improvement of std out printing, to ease user understanding
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-09-06 11:04:42 +01:00
James
c8ecc16648 Merge pull request #102 from Orange-OpenSource/userGuideImprovement
User guide improvement
2018-09-04 09:06:16 -07:00
Jean-Luc Auge
225fb1ec0c Save network to json after build (auto design)
-Promote incremental and iterative network design
-Automatic saving of a "network-name_auto_design.json" file
-Results of the autodesign (additional amplifiers, configuration) are
saved to this ..._auto_design.json file
-This file can then be used to run a new simulation just like a normal
json input file

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-09-04 09:34:57 +02:00
EstherLerouzic
c943e0e9b4 writing json results from path_requests_run.py into csv
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-09-03 15:10:32 +01:00
EstherLerouzic
3d8ac83fcc - Reference to path_requests_run.py added in README.rst file
- adding a program to convert json result file into a CSV file
- adding some more metrics in the results

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-08-31 17:16:48 +01:00
EstherLerouzic
06399ca8af Excel user guide modifications according to issue 95
- typo corrections
- explanations on cable id, create_eqt_sheet.py, dimensioning
- adding updates in the user guide
- harmonisation of naming: _eqpt instead of eqt in create_eqt_sheet.py
- improving standard output, removing the pi ref power print on standard output

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-08-31 09:41:42 +01:00
James
8f8fc13ded Merge pull request #101 from Orange-OpenSource/auto_design
Auto design features
2018-08-30 08:27:57 -07:00
James
2b018cb9a5 Merge pull request #96 from Orange-OpenSource/develop
fixed gain amplifier model and default values implementation from eqpt_config json
2018-08-23 07:05:02 -07:00
James
9577f4c9a3 Update README.rst 2018-08-21 11:47:49 -04:00
James
771d98cc10 Update README.rst 2018-08-21 11:31:28 -04:00
Jean-Luc Auge
91e875d54c Update unitary test_amplifier
test_amplifier.py now works in power mode, which is the recommended mode
for auto-design

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-10 11:36:33 +02:00
Jean-Luc Auge
da39f1489f power range fix for power sweep
align the behaviour of
	eqpt_config[SI].power_range_db
with
	eqpt_config[Spans].delta_power_range_db
so that the min and max excursions are taken into account (which was not the case
when using range or arange for the SI power excursion)

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

power range sweep fix

fix behaviour when power range length = 0
update eqpt_config json default values

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

restore verbose off for power sweep

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-10 11:36:29 +02:00
Jean-Luc Auge
2940576681 Output VOA in EDFA implementation
new field 'out_voa_auto': true/false in eqpt_config json for eligible
amplifiers
if the field is set to true and the power_mode (in
eqpt_config[Spans]) is also set to true, a VOA value is calculated based
on available gain and power margins (EOL) to maximize amplifier gain and
improve its NF

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-08 17:52:53 +02:00
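
A hedged sketch of the out_voa idea above (only when both power_mode and out_voa_auto are enabled, absorb the spare gain allowed by the gain and power margins in an output VOA); the margin rule and the numbers are assumptions for illustration, not gnpy's exact formula:

def auto_out_voa(gain_target, p_out_target, gain_flatmax=26.0, p_max=21.0,
                 power_mode=True, out_voa_auto=True):
    if not (power_mode and out_voa_auto):
        return 0.0
    # take only as much extra gain as both margins allow
    margin = min(gain_flatmax - gain_target, p_max - p_out_target)
    return max(0.0, margin)

print(auto_out_voa(gain_target=20.0, p_out_target=17.0))   # 4.0 dB of output VOA
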
Jean-Luc Auge
a49c137b78 Improve Select_edfa algorithm
can pick the best-NF amplifier among several amplifiers
sharing the same gain and/or power requirements when these requirements
are not satisfied (it used to take the highest gain or the highest power)

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-08 15:11:59 +02:00
Jean-Luc Auge
87e748cd83 Remove nf calculation duplication
align edfa_nf calculation in equipment.py to _calc_nf in elements.py

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-08 14:56:19 +02:00
Jean-Luc Auge
94c2e332bb update amplifier tests
readd CORONET xls

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-08 14:29:01 +02:00
Jean-Luc Auge
1437b6010e Small fixes
code cleaning
	consistency with path request
	generator bug fix
	select_edfa readability enhancement
	set roadm_loss before design in power mode

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-08 14:24:12 +02:00
Jean-Luc Auge
5fc203482d Amplifier selection with power constraints
amplifier selection in autodesign now takes into account
-required design gain
-required design power
-lowest NF among amplifiers satisfying the requirements

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-07 17:03:18 +02:00
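
A minimal sketch of the selection criteria listed above: keep amplifiers whose gain and power ratings satisfy the design targets, then pick the lowest NF (falling back to the best NF overall if nothing qualifies). The amplifier list and field names are illustrative only:

amps = [
    {'variety': 'std_low_gain',    'gain_flatmax': 16, 'p_max': 21, 'nf': 6.5},
    {'variety': 'std_medium_gain', 'gain_flatmax': 21, 'p_max': 21, 'nf': 6.0},
    {'variety': 'std_high_gain',   'gain_flatmax': 27, 'p_max': 22, 'nf': 5.5},
]

gain_target = 20    # dB
power_target = 20   # dBm

eligible = [a for a in amps
            if a['gain_flatmax'] >= gain_target and a['p_max'] >= power_target]
selection = min(eligible or amps, key=lambda a: a['nf'])
print(selection['variety'])   # -> std_high_gain with these example numbers
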
Jean-Luc Auge
ff6d81b749 Power setting refactor
separate in the code:
	* add egress amplifier
	* vs setting the amplifier parameters
=> prepare improvments in select_edfa and target_power code
regroup all network optimization operations in network.py (remove from
transmission_main)

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-07 13:39:08 +02:00
Jean-Luc Auge
167e644bd0 Amplifier disable field
* toggle true/false in the eqpt_config json file to allow the use of a given amplifier type in automatic design: eqpt_config[Edfa][allowed_for_design]: true/false
* automatic design picks the best amplifier (gain, NF constraints)
among the available ones: if toggled to false, the type is deemed
unavailable. The only alternative before this feature was to remove the
amplifier from the eqpt_config library.
* if set to false, the amplifier type can still be input in the network
topology file: it will be recognized.

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-02 12:57:09 +02:00
Jean-Luc Auge
d1c7489768 Automatic design Delta power / span
* automatic design (when amplifiers are missing from network topology
input) finds the optimum power difference between spans
* The range of this optimum power difference is defined in
eqpt_config[Spans][delta_power_range_db] = [min, max, step]

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-02 12:30:05 +02:00
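
As an illustration of how a [min, max, step] range such as eqpt_config[Spans][delta_power_range_db] can be expanded into the candidate per-span power offsets swept by auto-design (the values and the inclusive-bounds handling below are assumptions, not the exact gnpy code):

import numpy as np

delta_power_range_db = [-2, 3, 0.5]
low, high, step = delta_power_range_db

# include both end points, unlike a bare arange(low, high, step)
candidates = np.arange(low, high + step / 2, step) if step else np.array([low])
print(candidates)
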
Jean-Luc Auge
a783e165dd Add fixed gain amplifier unitary test
update pytest test_amplifier.py test file
correct test_network.json

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 15:01:10 +02:00
Jean-Luc Auge
45bdd82864 Add padding support for Fused spans
=> small-span padding now looks at the total loss of all spans spliced
together instead of individual span losses
=> the att_in padding attenuator is placed at the input of the 1st span
(the 1st of the series of spans spliced together with a Fused NE)

squashed with the fix to implement att_in attenuation in Fiber.propagate

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

intermediate bug fix to be squashed

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 14:44:03 +02:00
Jean-Luc Auge
79c5cb6b78 PADDING of small spans
read [Spans][padding] in eqpt_config json file
    * add input attenuator on all fibers in the network with loss < padding
    * create new Fiber attribute: att_in, which is a fixed attenuator for
    padding purposes
    * define "padding" : 0 in eqpt_config will disable the feature
    effectively

    improve fiber splitting to take into account the padding target in the
    min fiber length and the target length is set accordingly to this min
    length

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 12:15:36 +02:00
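
A minimal sketch of the padding rule described above, with invented values: any fiber whose loss is under the configured padding target gets an input attenuator att_in making up the difference:

padding_db = 10.0   # eqpt_config[Spans][padding]; 0 would disable the feature

fibers = [{'uid': 'fiber A-B', 'loss': 6.0, 'att_in': 0.0},
          {'uid': 'fiber B-C', 'loss': 14.0, 'att_in': 0.0}]

for fiber in fibers:
    if fiber['loss'] < padding_db:
        fiber['att_in'] = padding_db - fiber['loss']   # pad the short span
    print(fiber['uid'], fiber['att_in'])
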
Jean-Luc Auge
63ade5fdef Fix bug when splitting fiber in calculated length
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 12:13:08 +02:00
Jean-Luc Auge
853b8c7aa3 Code improvement: catch eqpt errors/remove outdated files
* Fiber or Edfa type_variety entries in the network topology that are not
    defined in the eqpt_config json (for example Fiber ELEAF) no longer
    crash the code: instead there is a clear error message and the code exits
    gracefully
    * remove outdated network topology json files

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 11:26:51 +02:00
Jean-Luc Auge
0655fb60de read fiber con_in, con_out, EOL & max_length default values
read fiber spans default values from eqpt_config.json[Spans]
	=>if con_in/con_out is None or not defined in params
Rename connector_loss_in/out to con_in/out for consistency
update parser tests to accept the new con_in/out syntax and the new default
values

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 11:24:28 +02:00
Jean-Luc Auge
69b28e3508 Update and pass amplifier tests
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 11:18:50 +02:00
Jean-Luc Auge
5c16d9539f new fixed gain amp in nf_model
new type_def attribute for nf_model amplifiers
variable gain: requires nf_min & nf_max input
fixed_gain: requires nf0 input
	=> NF=nf0 in the [gain_min, gain_flatmax] range
	=> NF=nf0+pad if gain < gain_min: automatic input padding

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 11:18:39 +02:00
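
A hedged sketch of the fixed-gain NF rule above: NF stays at nf0 inside the usable gain range, and below gain_min the missing gain is treated as input padding that degrades NF by the same amount. The parameter values are examples only:

def fixed_gain_nf(gain_target, nf0=5.5, gain_min=15.0, gain_flatmax=21.0):
    if gain_target < gain_min:
        return nf0 + (gain_min - gain_target)   # automatic input padding penalty
    return nf0                                  # flat NF up to gain_flatmax

print(fixed_gain_nf(18.0))   # 5.5
print(fixed_gain_nf(12.0))   # 8.5
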
Jean-Luc Auge
e3acf02bde Error message handling and code exit
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 11:18:21 +02:00
Jean-Luc Auge
10268e82d4 Power sweep parametrization
=>  read power range [lower, upper, step] in eqpt_config.json
            do not display intermediate NE info if range > 1
=>  update eqpt_config.json data

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 11:02:57 +02:00
Jean-Luc Auge
bc44bf726a Fix bug in setting the Roadm egress reference power
* apply to power_mode only
* bug caused by the previous merge

restore default connector loss in excel to json parser (convert.py) to
0dB

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-08-01 10:59:00 +02:00
James
9839681bc0 Merge pull request #92 from Orange-OpenSource/fix-tests
Fix tests
2018-07-17 07:24:38 -07:00
James
beb292cb07 Merge pull request #91 from berahtlv/docs
network and equipment JSON descriptions added
2018-07-17 06:55:52 -07:00
EstherLerouzic
29c4134f60 Add roll of factor in transponders library
roll_off is a characteristic of a given mode of a transponder: it is now
integrated in the Transceiver library in eqpt_config.json. Default value is
provided in the SI field if no specific transponder type is input

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-16 18:11:02 +01:00
EstherLerouzic
a6157f328d Work continued on fixing dutc merge with power feature of jla
- adding the relevant call in path_request_run
- unifying variable name: baudrate -> baud_rate
- correcting tests with the new power features
- unifying naming of variables: n_ch -> nb_channel, sink -> destination (this
  follows the usual naming in yang path computation models)
- bug fix in the new trx_mode_params function, use of the function in
  path_requests_run
- TODO: place roll-off in the tsp lib instead of SI

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-16 17:43:56 +01:00
Jean-Luc Auge
c6432e2c28 restore power management in transmission main
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-13 18:15:21 +02:00
Jean-Luc Auge
4a756bf2a9 Read trx parameters and find channel spacing&count
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-13 13:45:05 +02:00
Roberts Miculens
b77cc5cd40 file extension changed 2018-07-10 21:53:14 +03:00
James Powell
9f94af6ff7 add test_propagation 2018-07-10 10:47:27 -04:00
James Powell
7aba40cd5b fixed merge conflicts; added default pref value for SI 2018-07-10 10:35:42 -04:00
James Powell
fb6c17c5ff fixed merge conflict 2018-07-10 10:27:24 -04:00
James Powell
4504d80c80 fixed merge conflict 2018-07-10 10:16:13 -04:00
James
8a8c8989cb Merge pull request #85 from dutc/improve-tests
Test Cleanup
2018-07-10 06:47:09 -07:00
Roberts Miculens
20731dbe4b network and equipment JSON descriptions added 2018-07-09 17:46:15 +03:00
EstherLerouzic
55393ca9eb implement the use of TSP f_min
- tsp fmin is used instead of default SI value

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-09 10:24:34 +01:00
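
A small sketch of the precedence described above: use the transponder's f_min when it is defined, otherwise fall back to the SI default. The field names are simplified for the example:

si_default = {'f_min': 191.35e12}
trx_params = {'f_min': 191.5e12}    # may be missing or None in practice

f_min = trx_params.get('f_min') or si_default['f_min']
print(f_min)
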
EstherLerouzic
e9aa4d5601 Integration of Path_request_run functionalities into transmission_main_example
creation of modular functions to be called
use of the same propagate function in both examples

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-06 17:32:56 +01:00
EstherLerouzic
6d49769df9 Refactor path_request_run and integration of functions to transmission
- use load_equipment instead of load_SI
- integrate the pathrequest class into transmission main

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-05 18:33:04 +01:00
Jean-Luc Auge
0333e9d094 update amplifier test
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-05 19:21:48 +02:00
Jean-Luc Auge
53bedca50a parametrize and format power mode
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-05 17:07:59 +02:00
EstherLerouzic
b9518ca987 Starting: adding path_requests_run functions to the "core" package
- Path_request class is updated to be more generic and reusable
  in the transmission main example. json processing linked to the yang
  modelling is kept in path_request_run, while the path_request class
  only contains useful attributes
- adding functions in info and request python files in core directory

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-05 15:48:11 +01:00
Jean-Luc Auge
794e713d6d small fix source & node input in transmission main
enhance default value support
run global coronet network in the same US region

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-05 12:51:39 +02:00
Jean-Luc Auge
5e1fd7501e bug fix in amplifier extended gain range
NF_calc gain clamping to max gain in extended gain range

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-05 12:51:22 +02:00
Jean-Luc Auge
c86ea206d9 bug fix CORONET_GLOBAL_Topology example
bugs from the last refactor merge
- fix #1: update fiber length after split
- fix #2: add egress amp after fiber split

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-05 12:50:48 +02:00
Jean-Luc Auge
b810cf84c2 Power mode implementation
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-07-05 12:50:23 +02:00
EstherLerouzic
15bc5db2ed adding test on the propagation on a link example
the test verifies that computed OSNR and SNR values are consistent,
including connector loss and input power variation,
on 2 example links: 1 and 5 spans

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-04 16:45:28 +01:00
EstherLerouzic
4845d9005e Update tests after having inserted connector losses in json
the previous json did not have connector loss inputs:
this commit updates the expected json files with the additional
connector loss parameters.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-03 12:52:50 +01:00
James Powell
bed4e9f1e1 clean up JSON comparator & remove redundant files 2018-07-02 17:48:57 -04:00
James Powell
7b20db10cc remove duplicate CORONET file 2018-07-02 16:38:43 -04:00
James Powell
d59b3e8c46 small test fixes 2018-07-02 16:36:10 -04:00
James Powell
8fccbb0ac2 rename test files 2018-07-02 16:28:06 -04:00
EstherLerouzic
f36a610e52 adding connector loss on the "loss" property of fiber class
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-02 18:30:40 +01:00
James Powell
479a2f358e clean-up test files/locations 2018-07-02 12:37:13 -04:00
EstherLerouzic
8795c357ae Implementation of connector losses in the propagation function
- connector losses may or may not be part of the json description;
  in any case an input and output connector loss is now part of the fiber class.
  If not present, the default value is 0.0 dB

- the propagate function in the Fiber class now has an initial attenuation
  propagation step. The output connector loss is integrated in the
  initial loop function.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-07-02 17:23:58 +01:00
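
A rough sketch of the propagation order described above: input connector loss first, then the span attenuation, then the output connector loss. The dB bookkeeping and the numbers are illustrative only:

def propagate_power_dbm(p_in_dbm, fiber_loss_db, con_in_db=0.0, con_out_db=0.0):
    p = p_in_dbm - con_in_db      # input connector loss applied first
    p -= fiber_loss_db            # span attenuation
    p -= con_out_db               # output connector loss
    return p

print(propagate_power_dbm(0.0, 16.0, con_in_db=0.5, con_out_db=0.5))   # -17.0 dBm
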
James Powell
d8cb7526bb remove redundant import 2018-07-02 12:10:14 -04:00
James
6ead8e391b Merge pull request #83 from berahtlv/losses
parameterized ROADM and FUSED losses
2018-06-27 17:19:00 -04:00
Roberts Miculens
362f45083d parametrized ROADM and FUSED losses 2018-06-27 18:16:18 +03:00
James Powell
36218037ec fix link 2018-06-22 19:06:10 -04:00
James
584b56bc83 Merge pull request #82 from dutc/fix-tests
small test fixes
2018-06-22 19:01:25 -04:00
James Powell
d33602e131 small test fixes 2018-06-22 18:55:26 -04:00
James
4304b49bf0 Merge pull request #81 from Orange-OpenSource/test_update
Test update
2018-06-22 11:24:44 -04:00
Jean-Luc Auge
4f5325acac small fixes
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-06-21 16:14:25 +02:00
EstherLerouzic
f8a40bfaf0 correct path
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-19 17:34:28 +01:00
EstherLerouzic
52dfb20a2b try with .travis.yml to set the path to examples
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-19 17:27:17 +01:00
EstherLerouzic
34b20cdfe0 adding the examples directory as a place to find programs
to test with travis CI

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-19 16:57:34 +01:00
EstherLerouzic
f462201499 Update of test for automatic testing
- Separating test python files for different purposes:
      convert_file and convert_service_sheet are now tested in a dedicated
      parser-testing file, parsers_test.py
- Correction of tests due to previous major refactor (especially introducing
      load_network)
- adding test files and references (expected json files)
- adding a test on service json file
- adapting the compare module to generated files

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-19 16:38:56 +01:00
EstherLerouzic
5106bdf634 Excel_userguide update + bugfixes
Excel_userguide has been updated with a description of how to use the service sheet
documentation on convert_service_sheet has been added

bugfix on convert_service_sheet to avoid multiple file writing

bugfix on create_eqpt_sheet.py to avoid the creation of a line when site is FUSED type

Completing README and adding templates for json files

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-15 13:54:06 +01:00
EstherLerouzic
b93c6dbcbd Merge: merge completed
Path_requests_run and convert_service_sheet.py now use functions from transmission main

ISSUE from the develop refactor: the json generation of the topology is missing all amplifiers!
- ILAs are not included
- some edfas appear in the json topo when the Eqpt sheet is used, but not all of them (only for
the requested path when running transmission_main_example)

TODO: identify whether this is a new behaviour or a regression

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-13 16:42:10 +01:00
EstherLerouzic
2ca141baba Merge work : Transceiver type added in equipment library
Transceiver type added to include this equipment in the building of the
library

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-13 15:52:55 +01:00
EstherLerouzic
01fe5d2147 path_requests_run now generates a json file if -o is used
the format of path results follows the yang model of ietf

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-13 11:39:28 +01:00
EstherLerouzic
74830cede4 path_requests_run.py now can use the route constraint
convert_services_sheet.py reads all columns and creates ad hoc attributes.
The route constraint is saved in the "optimisation" field of the json format.
It is used as a strict constraint for path computation. An automatic decision
on the node type is made based on the name.

Spacing, nb of channels and input power have been set in the requests json;
however this may not be in line with the ietf way: they should instead be
link attributes. However, the objective of the program currently diverges
from this control-oriented description, since requests explore what-if
scenarios for each request with its specific transponder type.

Introduction of loose case for the route constraint , input power,
spacing, nb of channel, per service entry

If no path exists to a loose node, it is skipped; otherwise a critical error is raised

Minor corrections on convert and path request
- convert: remove source and destination from the intermediate list of nodes if
they are listed
- correction of the sheet header (column names)
- path-request: removing debug prints for a nice demo :)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-13 11:36:45 +01:00
EstherLerouzic
2f21cc29f7 Adding a verification on the trx_type entry of service sheet
verify that the trx_type is in the eqpt library
+ correction on the convert_service_sheet (use a list instead of multiple class initializations)

Parametrisation of the network file and spectral information in path_requests_run.py
addition of a second input file so that the input can be 2 excel files,
2 json files, or a mix
utilisation of the SI attribute in Eqpt_config.json for the input of spectral
information.

Implementation of transceiver_variety and parametrisation of eqpt_config.json
each request now reads the transceiver type and converts it into
a baudrate (assuming a mode).
the eqpt_config file is now an argument of path_requests_run.py

Adding additional columns in the service sheet for mode, system and route

Transponder selection now includes the tsp mode; the eqpt sheet now
also relies on the mode name.
additional columns have been defined to input system values such as
channel spacing, input power, nb of channels. TODO: this enables computing
the spectral information vector according to the transponder features
for each service in the list...
2 optional columns to define the path as input (with a loose attribute):
if present, the path must contain the ordered list of nodes to be crossed
TODO: use the path input instead of the internally computed path

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-13 11:34:01 +01:00
EstherLerouzic
05f8d97d68 module convert_service_sheet.py and path_requests_run.py
The modules convert the service sheet into a json file:
initial definition of classes.
Completing the json template of the path computation request
Adding the sync vector, but not yet in the correct form:
multiple sync entries with the same info should be avoided

Treating multiple requests:
path_request_run.py allows computing the snr for a list of requests
TODO: integrate into transmission_main_example and enable
feasibility computation without path computation

Using the transceiver class structure to export path computation feasibility
the output of the function now exports a list of transceiver elements,
each containing the result of spectral information propagation.

Adding the transceiver type in the eqpt library + correct form of the xls service sheet
the eqpt library includes information about transceivers and their modes
(baudrate, bitrate, OSNR threshold, and short name)
the service sheet must not have annotations or comments outside of the useful rows/columns

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-06-13 11:30:23 +01:00
James
04c4795192 Merge pull request #77 from Orange-OpenSource/refactor_merge_2
Refactor merge 2
2018-06-07 11:56:57 -04:00
Jean-Luc Auge
ff6e379b6f Use load_json utility
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-06-07 11:56:20 +02:00
Jean-Luc Auge
25cfc375bc Amplifier selection
network design when no amplifiers are input in the json file
select_edfa chooses the best amplifier in the equipment list _from
eqpt_config.json_ with the following criteria, in order:
1- amplifier with sufficient gain
2- amplifier with sufficient power TODO
3- best-NF amplifier satisfying 1 & 2

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-06-07 11:39:09 +02:00
Jean-Luc Auge
627184ef2d restore transmission main power sweep and format
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-06-07 10:59:33 +02:00
Jean-Luc Auge
180e1178ef restore xls to json file parser-execute from json
xls topology is parsed automatically to json
transmission_main reads from the json generated topology
transmission_main should not read from convert.py data output
This way there is consistency between json and xls inputs

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-06-07 09:36:07 +02:00
James
1c33454ce3 Merge pull request #76 from dutc/develop
Develop
2018-06-05 09:52:55 -04:00
James Powell
c7abe11dd8 Merge branch 'refactor_merge' into develop 2018-06-05 09:52:13 -04:00
James
7a6feccc8b Merge pull request #74 from Orange-OpenSource/refactor_merge
Refactor merge
2018-06-05 09:50:00 -04:00
James Powell
f05e578f51 small fixes 2018-06-05 09:46:48 -04:00
James
277ebea2b5 Merge pull request #75 from berahtlv/readme_links
fixed README broken hyperlinks
2018-06-05 09:35:24 -04:00
Roberts Miculens
4982c5e147 fixed README broken hyperlinks 2018-05-30 20:47:55 +03:00
Jean-Luc Auge
726e217205 merge debug
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-05-24 16:06:15 +02:00
James Powell
695c538ee5 merge of major refactor
cherry pick of the major refactor commit
many conflicts resolved and code re-written
2018-05-24 12:35:39 +02:00
James Powell
e9b287ceac small readme fix 2018-05-22 11:20:13 -04:00
James Powell
ecdae68b62 small readme fix 2018-05-22 11:10:06 -04:00
James Powell
e737564b32 add lat-long
jla cherry-picked in develop
2018-05-18 10:38:18 +02:00
James Powell
75c4a8d96d gnpy JSON comparison tool 2018-05-18 10:27:15 +02:00
James Powell
28dc1b050f remove redundant imports
jla cherry picked to merge with develop
2018-05-18 10:21:33 +02:00
James
4a9a7e31ba Merge pull request #71 from Orange-OpenSource/useability_improvement_3
Useability improvement 3
2018-05-03 11:33:48 -04:00
EstherLerouzic
92da31f905 file testing correction
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-03 15:25:45 +01:00
EstherLerouzic
afd264c816 new test files added
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-03 15:23:10 +01:00
EstherLerouzic
d676e3e217 removing a last config file
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-03 13:24:25 +01:00
EstherLerouzic
4af117aacf Simplification of tests
removing multiple excel files
adding a test amplifier in eqpt_library

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-03 13:20:40 +01:00
EstherLerouzic
b27e567a5e adding tests on excel parsing in amplifier_test.py
6 files are tested: if the converted json files constructed
from the excel files provided in the test folder are not
identical to the expected result, the test fails.
For future creation of test files:
the excel file name should start with excelTest.
the reference json file should correspond to the excelTest file,
e.g. excelTestFilefoo.xls -> testFilefoo.json

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-02 17:35:04 +01:00
EstherLerouzic
8ec9a5bf99 putting the correct Path for input files in amplifier_test.py
json file paths should be indicated relative to gnpy/

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-02 14:50:30 +01:00
EstherLerouzic
f129f92cfd Enabling input files indirection for excel and json parser + pytest update
replacing bare string reading with Path objects and splitting names based on
the Path structure in convert.py and in transmission_main_example.py

Update of pytest tests

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-05-02 14:28:49 +01:00
James
ef87373d4b Merge pull request #70 from Orange-OpenSource/useability_improvements_2
Useability improvements 2
2018-05-02 08:55:57 -04:00
James
ff938d4ced Merge pull request #66 from Orange-OpenSource/fused_spans
Fused spans, source/sink nodes suggestion, eqpt sheet support in xls parser
2018-05-02 08:55:38 -04:00
EstherLerouzic
ae555e0ce1 Formatting Excel_userguide.rst and adding a section for the optional Eqpt sheet
Typo corrections
Explanations added on how to fill in the excel Eqpt sheet
Removing verbose debug printing from create_eqpt_sheet.py
Bug correction: blanks removed when copying city names with create_eqpt_sheet.py
Typo corrections on README and Excel_userguide

adding a line about exporting PYTHONPATH in the readme
typo corrections

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-04-30 15:37:46 +01:00
EstherLerouzic
44db6093f5 Adding a user guide to explain the excel input format
The README now points to a user guide file explaining the sheet and column formats;
the guide is not finished yet

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-04-30 15:04:01 +01:00
EstherLerouzic
d4bf6ce201 Program that creates the list of entries for the Eqpt xls sheet
The program reads the xls Nodes and Links sheets and creates the
mandatory city column as a text file.
If not present in Nodes, the Type column is implicitly derived
from the node degree (degree 2 means ILA, anything else means ROADM)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>

Text file path and name copied from excel input file for create_eqpt_sheet.py

input file name is used for the txt file creation
file is saved at the same place as the input file

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2018-04-30 15:02:08 +01:00
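
An illustrative sketch of the implicit node-type rule above: when the Type column is missing, a node of degree 2 is treated as an ILA and any other degree as a ROADM. The tiny degree table is invented for the example:

degrees = {'Site A': 2, 'Site B': 3, 'Site C': 1}

node_types = {city: ('ILA' if degree == 2 else 'ROADM')
              for city, degree in degrees.items()}
print(node_types)
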
Jean-Luc Auge
0b1c7bfa30 Merge branch 'fused_spans-fixes' of https://github.com/dutc/gnpy into dutc-fused_spans
Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-27 12:17:41 +02:00
Jean-Luc Auge
80f8a345b5 Sanity checks and small code fix
xls convert:
	 sanity check Nodes vs Links sheets
	 column parsing limitation so the user can add non-code-related
fields/information
code fix and eqpt_exists to check whether an equipment is defined in the eqpt_config.json library

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-27 11:36:25 +02:00
James
c929ce0257 Merge pull request #69 from dutc/develop
quick demo section & command-line parameter for plotting in transmission_main_example.py
2018-04-26 12:20:25 -04:00
James Powell
7d3ab357d0 command-line parameter for plotting 2018-04-26 12:05:22 -04:00
James Powell
84d87602b8 quick demo section 2018-04-26 12:04:59 -04:00
James Powell
14938ea7dd small fixes 2018-04-25 02:42:08 -04:00
Jean-Luc Auge
be0c052e5e source/destination nodes suggestion
fix bug in Fused span: adding ingress and egress directions
=> do not share the same NE between 2 directions

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-24 15:16:20 +02:00
Jean-Luc Auge
af90a6658e Fused spans class and edfa gain adaptation
*new Fused spans class to support the connection between 2 fibers
without amplification
*add_egress_amplifier adjusts the edfa gain to compensate the loss
of n fused spans
*eqpt not found in the eqpt library is caught and stops the code

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

code fix for the eqpt sheet reading in xls parser

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>

amplifier design and eqpt parser code fix

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-24 15:15:59 +02:00
James
9720c979b3 fix sys.path for notebook (when running from repo or Binder) (#65) 2018-04-22 20:23:21 -04:00
James
7c86186416 Merge pull request #64 from dutc/develop
notebook version of transmission main example with Binder link
2018-04-22 20:12:24 -04:00
James Powell
404dbfe725 notebook version of transmission main example with Binder link 2018-04-22 20:10:09 -04:00
James Powell
02eac208dd Merge branch 'develop' 2018-04-22 19:11:42 -04:00
James Powell
bc9a0432cd update readme 2018-04-22 19:06:35 -04:00
James Powell
dba7846d6c equipment json paths use relative directories 2018-04-22 19:06:35 -04:00
Jean-Luc Auge
238d7723bb Read eqpt sheet in (convert.py) xls parser
*eqpt sheet is optional: no error message if missing
*A description of all nodes is not needed
  => only read the node eqpt config when amp_type is input
  => if amp_type = '', the node ingress or egress eqpt is ignored
  => the network.py add_egress_amplifier will fill in missing edfa

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-22 19:06:35 -04:00
Jean-Luc Auge
d2724bb1a5 Code improvement in convert.py
improve code readability and clarity of the topology parser:
*modify the connections binding
=>def fiber_dest_from_source to find all dest_city connected to a source_city
=>def fiber_link to return the fiber link from source_city to dest_city
*makes the parser algorithm more understandable
*prepare for the equipment parser

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-22 19:06:35 -04:00
Jean-Luc Auge
8e046a9b50 Implement default values in xls parser for empty cells
*namedtuple default values are only applied if the cell is not read
*if the cell is read but is empty (''), default values need to be
provided: which is what this fix does
*sanity checks are reinforced :
	-node_type is replaced by ROADM if degree <> 2: notify user
(print)
	-check that node type is in ('ILA','ROADM','FUSED')

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-22 19:06:35 -04:00
Jean-Luc Auge
1b808afd5d Advanced xls input support: Nodes & Links reading
Parse advanced fiber parameters from xls into json format:
accept a .xls or a .json input parameter
creates the corresponding .json file
read east/west directions for fiber parameters (fiber type, distance...)
read node type (ILA or ROADM)
provide default values if fields are missing
    => full backward compatibility to read the original CORONET xls

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-22 19:06:35 -04:00
Jean-Luc Auge
8d6f69eb05 Equipment library implementation
-read eqpt_config.json with various EDFA and fiber type definitions
-std low gain and medium gain EDFA available
-clearer differentiation between the 2 edfa models:
	is the advanced_config_from_json field present or not?

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-22 19:06:35 -04:00
Jean-Luc Auge
c4562df955 Add source sink arguments, restore convert.py
demonstrate mesh routing capability with source & sink options
readd CORONET xls and convert.py parser from xls to json

Signed-off-by: Jean-Luc Auge <jeanluc.auge@orange.com>
2018-04-22 19:06:35 -04:00
James Powell
8e1e8a8be3 re-add coronet files 2018-04-10 00:10:54 -04:00
Mattia Cantono
363e92e072 change RTD python version (#47) 2018-03-28 15:34:01 -04:00
James Powell
c5a52fdb6d small setup.py fixes 2018-03-22 13:17:06 -04:00
James Powell
2dd096cfab bump version to v0.1.2 2018-03-22 13:07:58 -04:00
James Powell
80e1ce12ce bump version to v0.1.1 2018-03-22 13:04:00 -04:00
James Powell
2777d957e4 add long_description_content_type 2018-03-22 13:00:47 -04:00
James Powell
898aa4f41a space align __str__ output 2018-03-20 22:44:32 -04:00
James Powell
c9ece6ad7c use scipy.constants 2018-03-20 22:44:17 -04:00
James Powell
a6265c1b8d document modules 2018-03-20 21:32:02 -04:00
James Powell
5646714c13 small setup.py fixes 2018-03-16 16:45:18 -04:00
James Powell
8ab13537a9 prettier printing 2018-03-15 19:09:29 -04:00
James Powell
25e14e4846 fixed __repr__
__repr__ "should look like a valid Python expression that could be used
to recreate an object with the same value" - https://docs.python.org/3/reference/datamodel.html#object.__repr__
2018-03-15 18:29:26 -04:00
James Powell
55d67f890d clean up examples/ directory 2018-03-15 15:52:39 -04:00
James Powell
9b3d7614f5 remove unused network files 2018-03-15 15:50:17 -04:00
James Powell
a3273d24b5 doc fixes 2018-03-15 14:46:24 -04:00
James Powell
ffd7bec485 docs fix 2018-03-15 14:42:36 -04:00
James Powell
0aef76407d small documentation fixes 2018-03-15 14:35:34 -04:00
James Powell
f2ad236863 fix authors 2018-03-15 14:26:16 -04:00
James Powell
487237638e Merge branch 'develop' 2018-03-15 14:25:26 -04:00
James Powell
3a78ccafce add AUTHORS.rst 2018-03-15 14:18:34 -04:00
150 changed files with 242333 additions and 12579 deletions

2
.bandit Normal file

@@ -0,0 +1,2 @@
[bandit]
skips: B101

1
.codecov.yml Normal file

@@ -0,0 +1 @@
comment: off

3
.docker-entry.sh Executable file

@@ -0,0 +1,3 @@
#!/bin/bash
cp -nr /oopt-gnpy/gnpy/example-data /shared
exec "$@"

47
.docker-travis.sh Executable file

@@ -0,0 +1,47 @@
#!/bin/bash
set -e
IMAGE_NAME=telecominfraproject/oopt-gnpy
IMAGE_TAG=$(git describe --tags)
ALREADY_FOUND=0
docker pull ${IMAGE_NAME}:${IMAGE_TAG} && ALREADY_FOUND=1
if [[ $ALREADY_FOUND == 0 ]]; then
docker build . -t ${IMAGE_NAME}
docker tag ${IMAGE_NAME} ${IMAGE_NAME}:${IMAGE_TAG}
# shared directory setup: do not clobber the real data
mkdir trash
cd trash
docker run -it --rm --volume $(pwd):/shared ${IMAGE_NAME} gnpy-transmission-example
else
echo "Image ${IMAGE_NAME}:${IMAGE_TAG} already available, will just update the other tags"
fi
docker images
do_docker_login() {
echo "${DOCKER_PASSWORD}" | docker login -u "${DOCKER_USERNAME}" --password-stdin
}
if [[ "${TRAVIS_PULL_REQUEST}" == "false" ]]; then
if [[ "${TRAVIS_BRANCH}" == "develop" || "${TRAVIS_BRANCH}" == "docker" ]]; then
echo "Publishing latest"
docker tag ${IMAGE_NAME}:${IMAGE_TAG} ${IMAGE_NAME}:latest
do_docker_login
if [[ $ALREADY_FOUND == 0 ]]; then
docker push ${IMAGE_NAME}:${IMAGE_TAG}
fi
docker push ${IMAGE_NAME}:latest
elif [[ "${TRAVIS_BRANCH}" == "master" ]]; then
echo "Publishing stable"
docker tag ${IMAGE_NAME}:${IMAGE_TAG} ${IMAGE_NAME}:stable
do_docker_login
if [[ $ALREADY_FOUND == 0 ]]; then
docker push ${IMAGE_NAME}:${IMAGE_TAG}
fi
docker push ${IMAGE_NAME}:stable
fi
fi

29
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,29 @@
---
name: Bug report
about: Create a report to help us improve
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Environment:**
- OS: [e.g. Windows]
- Python Version [e.g, 3.7]
- Anaconda Version [e.g. 3.7]
**Additional context**
Add any other context about the problem here.


@@ -0,0 +1,17 @@
---
name: Feature request
about: Suggest an idea for this project
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

2
.gitignore vendored

@@ -2,6 +2,8 @@
__pycache__/
*.py[cod]
*$py.class
.ipynb_checkpoints
.idea
# C extensions
*.so

5
.gitreview Normal file

@@ -0,0 +1,5 @@
[gerrit]
host=review.gerrithub.io
project=Telecominfraproject/oopt-gnpy
defaultrebase=0
defaultbranch=develop

4
.readthedocs.yml Normal file

@@ -0,0 +1,4 @@
build:
image: latest
python:
version: 3.6


@@ -1,9 +1,24 @@
dist: xenial
sudo: false
language: python
services: docker
python:
- "3.6"
# command to install dependencies
install:
- pip install -r requirements.txt
# command to run tests
- "3.7"
install: skip
script:
- pytest
- python setup.py develop
- pip install pytest-cov rstcheck
- pytest --cov-report=xml --cov=gnpy -v
- rstcheck --ignore-roles cite *.rst
- sphinx-build -W --keep-going docs/ x-throwaway-location
after_success:
- bash <(curl -s https://codecov.io/bash)
jobs:
include:
- stage: test
name: Docker image
script:
- git fetch --unshallow
- ./.docker-travis.sh
- docker images

44
.zuul.yaml Normal file

@@ -0,0 +1,44 @@
---
- project:
check:
jobs:
- tox-py36-cover
- coverage-diff:
voting: false
dependencies:
- tox-py36-cover-previous
- tox-py36-cover
vars:
coverage_job_name_previous: tox-py36-cover-previous
coverage_job_name_current: tox-py36-cover
- tox-linters-diff:
voting: false
- tox-docs-el8
- tox-py36-cover-previous
gate:
jobs:
- tox-py36-el8
- tox-docs-el8
tag:
jobs:
- oopt-release-python:
secrets:
- secret: pypi-oopt-gnpy
name: pypi_info
pass-to-parent: true
- secret:
name: pypi-oopt-gnpy
data:
username: __token__
password: !encrypted/pkcs1-oaep
- Taod9JmSMtVAvC5ShSbB3UWuccktQvutdySrj0G7a1Nk4tKFQIdwDXEnBuLpHsZVvsU9Q
6uk4wRVQABDSdNNI/+M/1FwmZfoxuOXa02U5S1deuxW/rBHTxzYcuB8xriwhArBvTiDMk
zyWHVysgDsjlR+85h/DkEhvsaMRDLYWqFwYgXizMoGNKVkwDVIH+qkhBmbggQfDpcYPKT
1gq0d6fw0eKVJtO8+vonMEcE0sWZvHmZvSSu0H++gxoe1W/JtzbCteH3Ak0zktwBHI8Qt
WBqFvY3laad335tpkFJN5b949N+DP8svCWwRwXmkZlHplPYZWF6QpYbEEXL/6Q0H6VwL+
om4f7ybYpKe9Gl939uv2INnXaKe5EU6CMsSw40r2XZCjnSTjWOTgh9pUn2PsoHnqUlALW
VR4Z+ipnCrEbu8aTmX3ROcnwYNS7OXkq4uhwDU1u9QjzyMHet6NQQhwhGtimsTo9KhL4E
TEUNiRlbAgow9WOwM5r3vRzddO8T2HZZSGaWj75qNRX46XPQWRWgB7ItAwyXgwLZ8UzWl
HdztjS3D7Hlsqno3zxNOVlhA5/vl9uVnhFbJnMtUOJAB07YoTJOeR+LjQ0avx/VzopxXc
RA/WvJXVZSBrlAHY0+ip4wPZvdi4Ph90gpmvHJvoH82KVfp2j5jxzUhsage94I=

29
AUTHORS.rst Normal file

@@ -0,0 +1,29 @@
gnpy is written and maintained by the Telecom Infra Project with work
contributed by the following TIP members.
To learn how to contribute, please see CONTRIBUTING.md
(*in alphabetical order*)
- Alessio Ferrari (Politecnico di Torino) <alessio.ferrari@polito.it>
- Anders Lindgren (Telia Company) <Anders.X.Lindgren@teliacompany.com>
- Andrea D'Amico (Politecnico di Torino) <andrea.damico@polito.it>
- Brian Taylor (Facebook) <briantaylor@fb.com>
- David Boertjes (Ciena) <dboertje@ciena.com>
- Diego Landa (Facebook) <dlanda@fb.com>
- Esther Le Rouzic (Orange) <esther.lerouzic@orange.com>
- Gabriele Galimberti (Cisco) <ggalimbe@cisco.com>
- Gert Grammel (Juniper Networks) <ggrammel@juniper.net>
- Gilad Goldfarb (Facebook) <giladg@fb.com>
- James Powell (Telecom Infra Project) <james.powell@telecominfraproject.com>
- Jan Kundrát (Telecom Infra Project) <jan.kundrat@telecominfraproject.com>
- Jeanluc Augé (Orange) <jeanluc.auge@orange.com>
- Jonas Mårtensson (RISE) <jonas.martensson@ri.se>
- Mattia Cantono (Politecnico di Torino) <mattia.cantono@polito.it>
- Miguel Garrich (University Catalunya) <miquel.garrich@upct.es>
- Raj Nagarajan (Lumentum) <raj.nagarajan@lumentum.com>
- Roberts Miculens (Lattelecom) <roberts.miculens@lattelecom.lv>
- Shengxiang Zhu (University of Arizona) <szhu@email.arizona.edu>
- Stefan Melin (Telia Company) <Stefan.Melin@teliacompany.com>
- Vittorio Curri (Politecnico di Torino) <vittorio.curri@polito.it>
- Xufeng Liu (Jabil) <xufeng_liu@jabil.com>


@@ -1,17 +0,0 @@
Contributors in alphabetical order
==================================
Name | Surname | Affiliation | Contact
-----|---------|-------------|--------
Alessio | Ferrari | Politecnico di Torino | alessio.ferrari@polito.it
Brian | Taylor | Facebook | briantaylor@fb.com
David | Boertjes | Ciena | dboertje@ciena.com
Esther | Le Rouzic | Orange | esther.lerouzic@orange.com
Gabriele | Galimberti | Cisco | ggalimbe@cisco.com
Gert | Grammel | Juniper Networks | ggrammel@juniper.net
Gilad | Goldfarb | Facebook | giladg@fb.com
James | Powell | Consultant | james@dontusethiscode.com
Jeanluc | Auge | Orange | jeanluc.auge@orange.com
Liu | Xufeng | Jabil | Xufeng_Liu@jabil.com
Mattia | Cantono | Politecnico di Torino | mattia.cantono@polito.it
Vittorio | Curri | Politecnico di Torino | vittorio.curri@polito.it

Dockerfile Normal file

@@ -0,0 +1,8 @@
FROM python:3.7-slim
COPY . /oopt-gnpy
WORKDIR /oopt-gnpy
RUN apt update; apt install -y git
RUN pip install .
WORKDIR /shared/example-data
ENTRYPOINT ["/oopt-gnpy/.docker-entry.sh"]
CMD ["/bin/bash"]

README.rst

@@ -1,15 +1,21 @@
====
.. image:: docs/images/GNPy-banner.png
:width: 100%
:align: left
:alt: GNPy with an OLS system
====================================================================
`gnpy`: mesh optical network route planning and optimization library
====
====================================================================
|docs| |build|
|docs| |travis| |doi| |contributors| |codacy-quality| |codecov|
**gnpy is an open-source, community-developed library for building route planning
and optimization tools in real-world mesh optical networks.**
**`gnpy` is an open-source, community-developed library for building route
planning and optimization tools in real-world mesh optical networks.**
`gnpy <http://github.com/telecominfraproject/gnpy>`_ is:
`gnpy <http://github.com/telecominfraproject/oopt-gnpy>`__ is:
--------------------------------------------------------------
- a sponsored project of the `OOPT/PSE <http://telecominfraproject.com/project-groups-2/backhaul-projects/open-optical-packet-transport/>`_ working group of the `Telecom Infra Project <http://telecominfraproject.com>`_.
- a sponsored project of the `OOPT/PSE <https://telecominfraproject.com/open-optical-packet-transport/>`_ working group of the `Telecom Infra Project <http://telecominfraproject.com>`_
- fully community-driven, fully open source library
- driven by a consortium of operators, vendors, and academic researchers
- intended for rapid development of production-grade route planning tools
@@ -18,63 +24,67 @@ and optimization tools in real-world mesh optical networks.**
Documentation: https://gnpy.readthedocs.io
Installation
------------
Get In Touch
~~~~~~~~~~~~
``gnpy`` is hosted in the `Python Package Index <http://pypi.org/>`_ (`gnpy <https://pypi.org/project/gnpy/>`_). It can be installed via:
There are `weekly calls <https://telecominfraproject.workplace.com/events/702894886867547/>`__ about our progress.
Newcomers, users and telecom operators are especially welcome there.
We encourage all interested people outside the TIP to `join the project <https://telecominfraproject.com/apply-for-membership/>`__.
.. code-block:: shell
Branches and Tagged Releases
----------------------------
$ pip install gnpy
- all releases are `available via GitHub <https://github.com/Telecominfraproject/oopt-gnpy/releases>`_
- the `master <https://github.com/Telecominfraproject/oopt-gnpy/tree/master>`_ branch contains stable, `validated code <https://github.com/Telecominfraproject/oopt-gnpy/wiki/Testing-for-Quality>`_. It is updated from develop on a release schedule determined by the OOPT-PSE Working Group.
- the `develop <https://github.com/Telecominfraproject/oopt-gnpy/tree/develop>`_ branch contains the latest code under active development, which may not be fully validated and tested.
It can also be installed directly from the repo.
How to Install
--------------
.. code-block:: shell
Install either via `Docker <docs/install.rst#install-docker>`__, or as a `Python package <docs/install.rst#install-pip>`__.
$ git clone https://github.com/telecominfraproject/gnpy
$ cd gnpy
$ python setup.py install
Both approaches above will handle installing any additional software dependencies.
**Note**: *We recommend the use of the Anaconda Python distribution
(https://www.anaconda.com/download) which comes with many scientific
computing dependencies pre-installed.*
Instructions for Use
--------------------
Instructions for First Use
--------------------------
``gnpy`` is a library for building route planning and optimization tools.
It ships with a number of example programs. Release versions will ship with
fully-functional programs.
**Note**: *If you are a network operator or involved in route planning and
optimization for your organization, please contact project maintainer James
Powell <james.powell@telecominfraproject>. gnpy is looking for users with
optimization for your organization, please contact project maintainer Jan
Kundrát <jan.kundrat@telecominfraproject.com>. gnpy is looking for users with
specific, delineated use cases to drive requirements for future
development.*
This example demonstrates how GNPy can be used to check the expected SNR at the end of the line by varying the channel input power:
**To get started, run the transmission example:**
.. image:: https://telecominfraproject.github.io/oopt-gnpy/docs/images/transmission_main_example.svg
:width: 100%
:align: left
:alt: Running a simple simulation example
:target: https://asciinema.org/a/252295
.. code-block:: shell
$ python examples/transmission_main_example.py
By default, this script operates on a single span network defined in `examples/edfa/edfa_example_network.json <examples/edfa/edfa_example_network.json>`_
By default, this script operates on a single span network defined in
`gnpy/example-data/edfa_example_network.json <gnpy/example-data/edfa_example_network.json>`_
You can specify a different network at the command line as follows. For
example, to use the CORONET Continental US (CONUS) network defined in `examples/coronet_conus_example.json <examples/coronet_conus_example.json>`_:
example, to use the CORONET Global network defined in
`gnpy/example-data/CORONET_Global_Topology.json <gnpy/example-data/CORONET_Global_Topology.json>`_:
.. code-block:: shell
.. code-block:: shell-session
$ python examples/transmission_main_example.py examples/coronet_conus_example.json
$ gnpy-transmission-example $(gnpy-example-data)/CORONET_Global_Topology.json
This script will calculate the average signal osnr and snr across 93 network
elements (transceiver, ROADMs, fibers, and amplifiers) between Abilene, Texas
and Albany, New York.
It is also possible to use an Excel file input (for example
`gnpy/example-data/CORONET_Global_Topology.xlsx <gnpy/example-data/CORONET_Global_Topology.xlsx>`_).
The Excel file will be processed into a JSON file with the same prefix.
Further details about the Excel data structure are available `in the documentation <docs/excel.rst>`__.
The main transmission example will calculate the average signal OSNR and SNR
across network elements (transceivers, ROADMs, fibers, and amplifiers)
between two transceivers selected by the user. Additional details are provided by running ``gnpy-transmission-example -h``. (By default, for the CORONET Global
network, it will show the transmission of spectral information between Abilene and Albany.)
This script calculates the average signal OSNR = |OSNR| and SNR = |SNR|.
@@ -87,16 +97,55 @@ interference noise.
.. |Pase| replace:: P\ :sub:`ase`
.. |Pnli| replace:: P\ :sub:`nli`
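Spelled out with the substitutions above (restated here for convenience, using the usual GN-model ratios):

.. math::

    \mathrm{OSNR} = \frac{P_{\mathrm{ch}}}{P_{\mathrm{ase}}},
    \qquad
    \mathrm{SNR} = \frac{P_{\mathrm{ch}}}{P_{\mathrm{ase}} + P_{\mathrm{nli}}}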
The `transmission_main_example.py <examples/transmission_main_example.py>`_
script propagates a specrum of 96 channels at 32 Gbaud, 50 GHz spacing and 0
dBm/channel. These are not yet parametrized but can be modified directly in the
script (via the SpectralInformation tuple) to accomodate any baud rate,
spacing, power or channel count demand.
Further Instructions for Use
----------------------------
Simulations are driven by a set of `JSON <docs/json.rst>`__ or `XLS <docs/excel.rst>`__ files.
The ``gnpy-transmission-example`` script propagates a spectrum of channels at 32 Gbaud, 50 GHz spacing and 0 dBm/channel.
Launch power can be overridden by using the ``--power`` argument.
Spectrum information is not yet parametrized but can be modified directly in the ``eqpt_config.json`` (via the ``SpectralInformation`` -SI- structure) to accommodate any baud rate or spacing.
The number of channels is computed from the ``spacing``, ``f_min``, and ``f_max`` values.
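As a rough illustration (a sketch only, not GNPy's exact code), the channel count implied by ``f_min``, ``f_max`` and ``spacing`` can be estimated as:

.. code-block:: python

    # Rough sketch of how the channel count follows from f_min/f_max/spacing;
    # GNPy's own computation may differ in details such as rounding.
    def estimated_channel_count(f_min_hz, f_max_hz, spacing_hz):
        return int((f_max_hz - f_min_hz) // spacing_hz) + 1

    # e.g. a 191.35-196.1 THz window on a 50 GHz grid
    print(estimated_channel_count(191.35e12, 196.1e12, 50e9))  # -> 96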
Experimental support for Raman amplification is available:
.. code-block:: shell-session
$ gnpy-transmission-example \
$(gnpy-example-data)/raman_edfa_example_network.json \
--sim $(gnpy-example-data)/sim_params.json --show-channels
Configuration of Raman pumps (their frequencies, power and pumping direction) is done via the `RamanFiber element in the network topology <gnpy/example-data/raman_edfa_example_network.json>`_.
General numeric parameters for simulation control are provided in `gnpy/example-data/sim_params.json <gnpy/example-data/sim_params.json>`_.
Use ``gnpy-path-request`` to run multiple optimizations as follows:
.. code-block:: shell-session
$ gnpy-path-request -h
Usage: gnpy-path-requests [-h] [-v] [-o OUTPUT] [network_filename] [service_filename] [eqpt_filename]
The ``network_filename`` and ``service_filename`` can be an XLS or JSON file. The ``eqpt_filename`` must be a JSON file.
To see an example of it, run:
.. code-block:: shell-session
$ cd $(gnpy-example-data)
$ gnpy-path-request meshTopologyExampleV2.xls meshTopologyExampleV2_services.json eqpt_config.json -o output_file.json
This program requires a list of connections to be estimated and the equipment
library. The program computes performances for the list of services (accepts
`JSON <docs/json.rst>`__ or `Excel <docs/excel.rst>`__ format) using the same spectrum propagation modules as
``gnpy-transmission-example``.
The output format is based on `draft-ietf-teas-yang-path-computation-01 <https://tools.ietf.org/html/draft-ietf-teas-yang-path-computation-01>`_ with custom extensions (e.g., for transponder modes).
An example of the JSON input is provided in file `service-template.json`, while results are shown in `path_result_template.json`.
Important note: ``gnpy-path-request`` is not a network dimensioning tool: each service does not reserve spectrum or occupy resources such as transponders. It only computes path feasibility, assuming that the spectrum (between the defined frequencies) is loaded with the "nb of channels" spaced by the "spacing" value specified in the system parameters of the service file, each channel having the same characteristics (baud rate, format, ...) as the service transponder. The transceiver element acts as a "logical starting/stopping point" for the spectral information propagation; it is not meant to represent the capacity of add/drop ports.
As a result, the transponder type is not part of the network information; it is related to the list of service requests.
The current version includes a spectrum assignment feature that computes a candidate spectrum assignment for each service based on a first-fit policy. Spectrum is assigned based on the service's spacing value, its path_bandwidth value, and the selected transceiver mode. This spectrum assignment includes a basic capacity planning capability, so the spectrum resource is limited by the minimum and maximum frequencies defined for the links. If the requested services exceed the link spectrum capacity, the feasibility of the additional services is still computed, but they are marked as blocked for spectrum reasons.
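The first-fit policy can be pictured with a toy sketch (an illustration only, not GNPy's actual spectrum assignment code):

.. code-block:: python

    # Toy first-fit assignment: each service needs a number of contiguous
    # slots; a service that no longer fits is marked as blocked.
    def first_fit(services, total_slots):
        occupied = [False] * total_slots
        result = {}
        for name, needed in services:
            placed = None
            for start in range(total_slots - needed + 1):
                if not any(occupied[start:start + needed]):
                    occupied[start:start + needed] = [True] * needed
                    placed = (start, start + needed - 1)
                    break
            result[name] = placed if placed else 'blocked (spectrum reason)'
        return result

    # svc3 is blocked: only 4 of the 16 slots remain after svc1 and svc2
    print(first_fit([('svc1', 4), ('svc2', 8), ('svc3', 6)], total_slots=16))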
The amplifier's gain is set to exactly compsenate for the loss in each network
element. The amplifier is currently defined with gain range of 15 dB to 25 dB
and 21 dBm max output power. Ripple and NF models are defined in
`examples/edfa_config.json <examples/edfa_config.json>`_
Contributing
------------
@@ -104,17 +153,17 @@ Contributing
``gnpy`` is looking for additional contributors, especially those with experience
planning and maintaining large-scale, real-world mesh optical networks.
To get involved, please contact James Powell
<james.powell@telecominfraproject.com> or Gert Grammel <ggrammel@juniper.net>.
To get involved, please contact Jan Kundrát
<jan.kundrat@telecominfraproject.com> or Gert Grammel <ggrammel@juniper.net>.
``gnpy`` contributions are currently limited to members of `TIP
<http://telecominfraproject.com>`_. Membership is free and open to all.
See the `Onboarding Guide
<https://github.com/Telecominfraproject/gnpy/wiki/Onboarding-Guide>`_ for
specific details on code contribtions.
specific details on code contributions.
See `AUTHORS.Md <AUTHORS.Md>`_ for past and present contributors.
See `AUTHORS.rst <AUTHORS.rst>`_ for past and present contributors.
Project Background
------------------
@@ -122,7 +171,7 @@ Project Background
Data Centers are built upon interchangeable, highly standardized node and
network architectures rather than a sum of isolated solutions. This also
translates to optical networking. It leads to a push in enabling multi-vendor
optical network by disaggregating HW and SW functions and focussing on
optical network by disaggregating HW and SW functions and focusing on
interoperability. In this paradigm, the burden of responsibility for ensuring
the performance of such disaggregated open optical systems falls on the
operators. Consequently, operators and vendors are collaborating in defining
@@ -153,14 +202,34 @@ working group set out to disrupt the planning landscape by providing an open
source simulation model which can be used freely across multiple vendor
implementations.
.. |docs| image:: https://readthedocs.org/projects/gnpy/badge/?version=develop
:target: http://gnpy.readthedocs.io/en/develop/?badge=develop
.. |docs| image:: https://readthedocs.org/projects/gnpy/badge/?version=master
:target: http://gnpy.readthedocs.io/en/master/?badge=master
:alt: Documentation Status
:scale: 100%
.. |build| image:: https://travis-ci.org/mcantono/gnpy.svg?branch=develop
:target: https://travis-ci.org/mcantono/gnpy
:alt: Build Status
.. |travis| image:: https://travis-ci.com/Telecominfraproject/oopt-gnpy.svg?branch=master
:target: https://travis-ci.com/Telecominfraproject/oopt-gnpy
:alt: Build Status via Travis CI
:scale: 100%
.. |doi| image:: https://zenodo.org/badge/96894149.svg
:target: https://zenodo.org/badge/latestdoi/96894149
:alt: DOI
:scale: 100%
.. |contributors| image:: https://img.shields.io/github/contributors-anon/Telecominfraproject/oopt-gnpy
:target: https://github.com/Telecominfraproject/oopt-gnpy/graphs/contributors
:alt: Code Contributors via GitHub
:scale: 100%
.. |codacy-quality| image:: https://img.shields.io/lgtm/grade/python/github/Telecominfraproject/oopt-gnpy
:target: https://lgtm.com/projects/g/Telecominfraproject/oopt-gnpy/
:alt: Code Quality via LGTM.com
:scale: 100%
.. |codecov| image:: https://img.shields.io/codecov/c/github/Telecominfraproject/oopt-gnpy
:target: https://codecov.io/gh/Telecominfraproject/oopt-gnpy
:alt: Code Coverage via codecov
:scale: 100%
TIP OOPT/PSE & PSE WG Charter
@@ -187,5 +256,4 @@ License
``gnpy`` is distributed under a standard BSD 3-Clause License.
See `LICENSE <LICENSE>`_ for more details.
See `LICENSE <LICENSE>`__ for more details.


@@ -874,7 +874,7 @@ month={Sept},}
number = {7},
journal = {Optics Express},
urlyear = {2017-11-14},
year = {2012-03-26},
date = {2012-03-26},
year = {2012},
pages = {7777},
author = {Bononi, A. and Serena, P. and Rossi, N. and Grellier, E. and Vacondio, F.}
@@ -1114,7 +1114,7 @@ month={Sept},}
number = {26},
journal = {Optics Express},
urlyear = {2017-11-16},
year = {2013-12-30},
date = {2013-12-30},
year = {2013},
pages = {32254},
author = {Bononi, Alberto and Beucher, Ottmar and Serena, Paolo}

docs/conf.py

@@ -1,7 +1,7 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# GNpy documentation build configuration file, created by
# gnpy documentation build configuration file, created by
# sphinx-quickstart on Mon Dec 18 14:41:01 2017.
#
# This file is execfile()d with the current directory set to its
@@ -32,7 +32,9 @@ sys.path.insert(0, os.path.abspath('../'))
# ones.
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.mathjax',
'sphinx.ext.githubpages','sphinxcontrib.bibtex']
'sphinx.ext.githubpages',
'sphinxcontrib.bibtex',
'pbr.sphinxext',]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -47,19 +49,10 @@ source_suffix = ['.rst', '.md']
master_doc = 'index'
# General information about the project.
project = 'GNpy'
copyright = '2017, Telecom InfraProject - OOPT PSE Group'
project = 'gnpy'
copyright = '2018, Telecom InfraProject - OOPT PSE Group'
author = 'Telecom InfraProject - OOPT PSE Group'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '0.1'
# The full version, including alpha/beta/rc tags.
release = '0.1'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
@@ -87,8 +80,17 @@ todo_include_todos = False
on_rtd = os.environ.get('READTHEDOCS') == 'True'
if on_rtd:
html_theme = 'default'
html_theme_options = {
'logo_only': True,
}
else:
html_theme = 'alabaster'
html_theme_options = {
'logo': 'images/GNPy-logo.png',
'logo_name': False,
}
html_logo = 'images/GNPy-logo.png'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -99,7 +101,7 @@ else:
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
html_static_path = []
# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
@@ -120,7 +122,7 @@ html_sidebars = {
# -- Options for HTMLHelp output ------------------------------------------
# Output file base name for HTML help builder.
htmlhelp_basename = 'GNpydoc'
htmlhelp_basename = 'gnpydoc'
# -- Options for LaTeX output ---------------------------------------------
@@ -147,7 +149,7 @@ latex_elements = {
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'GNpy.tex', 'GNpy Documentation',
(master_doc, 'gnpy.tex', 'gnpy Documentation',
'Telecom InfraProject - OOPT PSE Group', 'manual'),
]
@@ -157,7 +159,7 @@ latex_documents = [
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'gnpy', 'GNpy Documentation',
(master_doc, 'gnpy', 'gnpy Documentation',
[author], 1)
]
@@ -168,10 +170,14 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'GNpy', 'GNpy Documentation',
author, 'GNpy', 'One line description of project.',
(master_doc, 'gnpy', 'gnpy Documentation',
author, 'gnpy', 'One line description of project.',
'Miscellaneous'),
]
autodoc_default_options = {
'members': True,
'undoc-members': True,
'private-members': True,
'show-inheritance': True,
}

docs/excel.rst Normal file

@@ -0,0 +1,228 @@
Excel (XLS, XLSX) input files
=============================
``gnpy-transmission-example`` can use an Excel input file instead of a JSON file; the program will then generate the corresponding JSON file for you.
The file named 'meshTopologyExampleV2.xls' is an example.
In order to work, the Excel file MUST contain at least 2 sheets:
- Nodes
- Links
(In progress) The file MAY contain additional sheets:
- Eqt
- Service
.. _excel-nodes-sheet:
Nodes sheet
-----------
Nodes sheet contains nine columns.
Each line represents a 'node' (a ROADM site, an in-line amplifier (ILA) site, or a Fused site)::
City (Mandatory) ; State ; Country ; Region ; Latitude ; Longitude ; Type
- **City** is used for the name of a node of the graph. It accepts letters, numbers, underscores, dashes, blanks, ... (not exhaustive). The user may want to avoid commas for future CSV exports.
**City name MUST be unique**
- **Type** is not mandatory.
- If not filled, it will be interpreted as an 'ILA' site if node degree is 2 and as a ROADM otherwise.
- If filled, it can take "ROADM", "FUSED" or "ILA" values. If another string is used, it will be considered as not filled. FUSED means that ingress and egress spans will be fused together.
- *State*, *Country*, *Region* are not mandatory.
"Region" is a holdover from the CORONET topology reference file `CORONET_Global_Topology.xlsx <gnpy/example-data/CORONET_Global_Topology.xlsx>`_. CORONET separates its network into geographical regions (Europe, Asia, Continental US.) This information is not used by gnpy.
- *Longitude*, *Latitude* are not mandatory. If filled they should contain numbers.
- **Booster_restriction** and **Preamp_restriction** are not mandatory.
If used, they must contain one or several amplifier type_variety names separated by ' | '. This information is used to restrict the types of amplifiers used in a ROADM node during autodesign. If a ROADM booster or preamp is already specified in the Eqpt sheet, the field is ignored. The field is also ignored if the node is not a ROADM node.
**There MUST NOT be empty line(s) between two node lines**
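For illustration, here is a minimal sketch of reading this sheet programmatically (this is not GNPy's own converter; it assumes the legacy ``xlrd`` package and that the first row holds the column headers):

.. code-block:: python

    # Read the mandatory "City" column and the optional "Type" column from
    # the Nodes sheet of an XLS topology file (sketch only).
    import xlrd

    wb = xlrd.open_workbook('meshTopologyExampleV2.xls')
    nodes = wb.sheet_by_name('Nodes')
    header = [str(cell.value).strip() for cell in nodes.row(0)]
    city_col, type_col = header.index('City'), header.index('Type')

    for row in range(1, nodes.nrows):
        city = nodes.cell_value(row, city_col)
        node_type = nodes.cell_value(row, type_col) or 'inferred from node degree'
        print(city, node_type)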
.. _excel-links-sheet:
Links sheet
-----------
Links sheet must contain sixteen columns::
<-- east cable from a to z --> <-- west from z to -->
NodeA ; NodeZ ; Distance km ; Fiber type ; Lineic att ; Con_in ; Con_out ; PMD ; Cable Id ; Distance km ; Fiber type ; Lineic att ; Con_in ; Con_out ; PMD ; Cable Id
The Links sheet MUST contain all links between nodes defined in the Nodes sheet.
Each line represents a 'bidir link' between two nodes. The two directions are represented on a single line with "east cable from a to z" fields and "west from z to a" fields. Values for 'a to z' may be different from values from 'z to a'.
Since both directions of a bidir 'a-z' link are described on the same line (east and west), the 'z to a' direction MUST NOT be repeated on a different line. If repeated, it will generate another parallel bidir link between the same end nodes.
Parameters for "east cable from a to z" and "west from z to a" are detailed in 2x7 columns. If not filled, "west from z to a" is copied from "east cable from a to z".
For example, a line filled with::
node6 ; node3 ; 80 ; SSMF ; 0.2 ; 0.5 ; 0.5 ; 0.1 ; cableB ; ; ; 0.21 ; 0.2 ; ; ;
will generate a unidir fiber span from node6 to node3 with::
[node6 node3 80 SSMF 0.2 0.5 0.5 0.1 cableB]
and a fiber span from node3 to node6::
[node3 node6 80 SSMF 0.21 0.2 0.5 0.1 cableB] attributes.
- **NodeA** and **NodeZ** are Mandatory.
They are the two endpoints of the link. They MUST contain a node name from the **City** names listed in Nodes sheet.
- **Distance km** is not mandatory.
It is the link length.
- If filled it MUST contain numbers. If empty it is replaced by a default "80" km value.
- If value is below 150 km, it is considered as a single (bidirectional) fiber span.
- If the value is over 150 km, the ``gnpy-transmission-example`` program will automatically assume that intermediate span descriptions are required and will generate fiber span elements with "_1", "_2", ... trailing strings, which are not visible in the JSON output. The reason for the splitting is that current EDFAs usually do not support large span losses. The current assumption is that links longer than 150 km will require intermediate amplification; this value will be revisited when Raman amplification is added. (A toy sketch of this splitting rule appears at the end of this section.)
- **Fiber type** is not mandatory.
If filled it must contain types listed in `eqpt_config.json <gnpy/example-data/eqpt_config.json>`_ in "Fiber" list "type_variety".
If not filled it takes "SSMF" as default value.
- **Lineic att** is not mandatory.
It is the lineic attenuation expressed in dB/km.
If filled it must contain positive numbers.
If not filled it takes "0.2" dB/km value
- *Con_in*, *Con_out* are not mandatory.
They are the connector loss in dB at ingress and egress of the fiber spans.
If filled they must contain positive numbers.
If not filled they take "0.5" dB default value.
- *PMD* is not mandatory and is not used yet.
It is the PMD value of the link in ps.
If filled they must contain positive numbers.
If not filled, it takes "0.1" ps value.
- *Cable Id* is not mandatory.
If filled they must contain strings with the same constraint as "City" names. Its value is used to differenate links having the same end points. In this case different Id should be used. Cable Ids are not meant to be unique in general.
(in progress)
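Here is the toy sketch of the splitting rule mentioned in the **Distance km** description (an illustration only, not GNPy's implementation):

.. code-block:: python

    # Links longer than 150 km are cut into equal sub-spans whose names get
    # "_1", "_2", ... suffixes.
    import math

    def split_link(node_a, node_z, length_km, max_span_km=150.0):
        if length_km <= max_span_km:
            return [('fiber ({} -> {})'.format(node_a, node_z), length_km)]
        n = math.ceil(length_km / max_span_km)
        return [('fiber ({} -> {})_{}'.format(node_a, node_z, i + 1), length_km / n)
                for i in range(n)]

    # a 320 km link becomes three sub-spans of ~106.7 km each
    print(split_link('node6', 'node3', 320))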
.. _excel-equipment-sheet:
Eqpt sheet
----------
The Eqt sheet is optional. It lists the amplifier types and characteristics on each degree of the *Node A* line.
Eqpt sheet must contain twelve columns::
<-- east cable from a to z --> <-- west from z to a -->
Node A ; Node Z ; amp type ; att_in ; amp gain ; tilt ; att_out ; delta_p ; amp type ; att_in ; amp gain ; tilt ; att_out ; delta_p
If the sheet is present, it MUST have as many lines as egress directions of ROADMs defined in Links Sheet.
For example, consider the following list of links (A, B and C being ROADMs and amp# being ILAs)
::
A - amp1
amp1 - amp2
Amp2 - B
A - amp3
amp3 - C
then Eqpt sheet should contain:
- one line for each ILAs: amp1, amp2, amp3
- one line for each degree 1 ROADMs B and C
- two lines for ROADM A which is a degree 2 ROADM
::
A - amp1
amp1 - amp2
Amp2 - B
A - amp3
amp3 - C
B - amp2
C - amp3
If you have already filled in the Nodes and Links sheets, `create_eqpt_sheet.py <gnpy/example-data/create_eqpt_sheet.py>`_ can be used to automatically create a template for the mandatory entries of the list.
.. code-block:: shell
$ cd $(gnpy-example-data)
$ python create_eqpt_sheet.py meshTopologyExampleV2.xls
This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content can be directly copied into the Eqt sheet of the Excel file. The user can then fill in the values in the rest of the columns.
- **Node A** is mandatory. It is the name of the node (as listed in Nodes sheet).
If Node A is a 'ROADM' (Type attribute in sheet Node), its number of occurrences must be equal to its degree.
If Node A is an 'ILA' it should appear only once.
- **Node Z** is mandatory. It is the egress direction from the *Node A* site. Multiple links between the same Node A and Node Z are not supported.
- **amp type** is not mandatory.
If filled it must contain types listed in `eqpt_config.json <gnpy/example-data/eqpt_config.json>`_ in "Edfa" list "type_variety".
If not filled it takes "std_medium_gain" as default value.
If filled with fused, a fused element with 0.0 dB loss will be placed instead of an amplifier. This might be used to avoid a booster amplifier on a ROADM direction.
- **amp_gain** is not mandatory. It is the value to be set on the amplifier (in dB).
If not filled, it will be determined with design rules in the convert.py file.
If filled, it must contain positive numbers.
- *att_in* and *att_out* are not mandatory and are not used yet. They are the value of the attenuator at input and output of amplifier (in dB).
If filled they must contain positive numbers.
- *tilt* --TODO--
- **delta_p**, in dBm, is not mandatory. If filled it is used to set the output target power per channel at the output of the amplifier, if power_mode is True. The output power is then set to power_dbm + delta_power.
# to be completed #
(in progress)
.. _excel-service-sheet:
Service sheet
-------------
Service sheet is optional. It lists the services for which path and feasibility must be computed with ``gnpy-path_request``.
Service sheet must contain 11 columns::
route id ; Source ; Destination ; TRX type ; Mode ; System: spacing ; System: input power (dBm) ; System: nb of channels ; routing: disjoint from ; routing: path ; routing: is loose?
- **route id** is mandatory. It must be unique. It is the identifier of the request. It can be an integer or a string (do not use blanks, dashes or commas).
- **Source** is mandatory. It is the name of the source node (as listed in Nodes sheet). Source MUST be a ROADM node. (TODO: relax this and accept trx entries)
- **Destination** is mandatory. It is the name of the destination node (as listed in Nodes sheet). Destination MUST be a ROADM node. (TODO: relax this and accept trx entries)
- **TRX type** is mandatory. They are the variety type and selected mode of the transceiver to be used for the propagation simulation. These modes MUST be defined in the equipment library. The format of the mode is used as the name of the mode. (TODO: maybe add another mode id on Transceiver library ?). In particular the mode selection defines the channel baudrate to be used for the propagation simulation.
- **mode** is optional. If not specified, the program will search for the mode of the defined transponder with the highest baudrate fitting within the spacing value.
- **System: spacing** is mandatory. Spacing is the channel spacing, defined in GHz, used for the feasibility propagation simulation, assuming system full load.
- **System: input power (dBm) ; System: nb of channels** are optional input defining the system parameters for the propagation simulation.
- input power is the channel optical input power in dBm
- nb of channels is the number of channels to be used for the simulation.
- **routing: disjoint from ; routing: path ; routing: is loose?** are optional.
- disjoint from: identifies the requests from which this request must be disjoint. If filled it must contain request ids separated by ' | '
- path: is the set of ROADM nodes that must be used by the path. It must contain the list of ROADM names that the path must cross. TODO : only ROADM nodes are accepted in this release. Relax this with any type of nodes. If filled it must contain ROADM ids separated by ' | '. Exact names are required.
- is loose? 'no' value means that the list of nodes should be strictly followed, while any other value means that the constraint may be relaxed if the node is not reachable.
- **path bandwidth** is mandatory. It is the amount of capacity required between source and destination in Gbit/s. Value should be positive (non zero). It is used to compute the amount of required spectrum for the service.
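For illustration, a single service request built from the columns above could be represented as follows (a Python dict sketch; all values are made up):

.. code-block:: python

    # One row of the Service sheet, as a plain dict (illustrative values only).
    service_row = {
        'route id': 'svc_1',
        'Source': 'Abilene',
        'Destination': 'Albany',
        'TRX type': 'Voyager',
        'Mode': 'mode 1',
        'System: spacing': 50,            # GHz
        'System: input power (dBm)': 0,
        'System: nb of channels': 96,
        'routing: disjoint from': '',
        'routing: path': 'Dallas | Atlanta',
        'routing: is loose?': 'no',
        'path bandwidth': 200,            # Gbit/s
    }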

docs/gnpy-api-core.rst Normal file

@@ -0,0 +1,13 @@
``gnpy.core``
-------------
.. automodule:: gnpy.core
.. automodule:: gnpy.core.ansi_escapes
.. automodule:: gnpy.core.elements
.. automodule:: gnpy.core.equipment
.. automodule:: gnpy.core.exceptions
.. automodule:: gnpy.core.info
.. automodule:: gnpy.core.network
.. automodule:: gnpy.core.parameters
.. automodule:: gnpy.core.science_utils
.. automodule:: gnpy.core.utils

docs/gnpy-api-tools.rst Normal file

@@ -0,0 +1,9 @@
``gnpy.tools``
--------------
.. automodule:: gnpy.tools
.. automodule:: gnpy.tools.cli_examples
.. automodule:: gnpy.tools.convert
.. automodule:: gnpy.tools.json_io
.. automodule:: gnpy.tools.plots
.. automodule:: gnpy.tools.service_sheet

docs/gnpy-api-topology.rst Normal file

@@ -0,0 +1,6 @@
``gnpy.topology``
-----------------
.. automodule:: gnpy.topology
.. automodule:: gnpy.topology.request
.. automodule:: gnpy.topology.spectrum_assignment

docs/gnpy-api.rst Normal file

@@ -0,0 +1,14 @@
***************************
API Reference Documentation
***************************
``gnpy`` package
================
.. automodule:: gnpy
.. toctree::
gnpy-api-core
gnpy-api-topology
gnpy-api-tools

docs/images/GNPy-banner.png Normal file (binary image, 518 KiB, not shown)

docs/images/GNPy-logo.png Normal file (binary image, 20 KiB, not shown)

docs/index.rst

@@ -1,37 +1,18 @@
.. GNpy documentation master file, created by
sphinx-quickstart on Mon Dec 18 14:41:01 2017.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
GNPy: Optical Route Planning Library
=====================================================================
Welcome to GNpy's documentation!
================================
Gaussian Noise (GN) based modeling library for physical layer impairment evaluation in optical networks.
Summary
--------
We believe that openly sharing ideas, specifications, and other intellectual property is the key to maximizing innovation and reducing complexity
PSE WG Charter
--------------
- Goal is to build an end-to-end simulation environment which defines the network models of the optical device transfer functions and their parameters. This environment will provide validation of the optical performance requirements for the TIP OLS building blocks.
- The model may be approximate or complete depending on the network complexity. Each model shall be validated against the proposed network scenario.
- The environment must be able to process network models from multiple vendors, and also allow users to pick any implementation in an open source framework.
- The PSE will influence and benefit from the innovation of the DTC, API, and OLS working groups.
- The PSE represents a step along the journey towards multi-layer optimization.
Documentation
=============
The following pages are meant to describe specific implementation details and modeling assumptions behind GNpy.
`GNPy <http://github.com/telecominfraproject/gnpy>`_ is an open-source,
community-developed library for building route planning and optimization tools
in real-world mesh optical networks. It is based on the Gaussian Noise Model.
.. toctree::
:maxdepth: 2
:maxdepth: 4
gn_model
install
json
excel
model
gnpy-api
Indices and tables
==================
@@ -40,31 +21,3 @@ Indices and tables
* :ref:`modindex`
* :ref:`search`
Contributors in alphabetical order
==================================
+----------+------------+-----------------------+----------------------------+
| Name | Surname | Affiliation | Contact |
+==========+============+=======================+============================+
| Alessio | Ferrari | Politecnico di Torino | alessio.ferrari@polito.it |
+----------+------------+-----------------------+----------------------------+
| Brian | Taylor | Facebook | briantaylor@fb.com |
+----------+------------+-----------------------+----------------------------+
| David | Boertjes | Ciena | dboertje@ciena.com |
+----------+------------+-----------------------+----------------------------+
| Esther | Le Rouzic | Orange | esther.lerouzic@orange.com |
+----------+------------+-----------------------+----------------------------+
| Gabriele | Galimberti | Cisco | ggalimbe@cisco.com |
+----------+------------+-----------------------+----------------------------+
| Gert | Grammel | Juniper Networks | ggrammel@juniper.net |
+----------+------------+-----------------------+----------------------------+
| Gilad | Goldfarb | Facebook | giladg@fb.com |
+----------+------------+-----------------------+----------------------------+
| James | Powell | Consultant | james@dontusethiscode.com |
+----------+------------+-----------------------+----------------------------+
| Jeanluc | Auge | Orange | jeanluc.auge@orange.com |
+----------+------------+-----------------------+----------------------------+
| Mattia | Cantono | Politecnico di Torino | mattia.cantono@polito.it |
+----------+------------+-----------------------+----------------------------+
| Vittorio | Curri | Politecnico di Torino | vittorio.curri@polito.it |
+----------+------------+-----------------------+----------------------------+

docs/install.rst Normal file

@@ -0,0 +1,111 @@
Installing GNPy
---------------
There are several ways to obtain GNPy.
The easiest option for a non-developer is probably to use our :ref:`Docker images<install-docker>`.
Developers are encouraged to install the :ref:`Python package in the same way as any other Python package<install-pip>`.
Note that this needs a :ref:`working installation of Python<install-python>`, for example :ref:`via Anaconda<install-anaconda>`.
.. _install-docker:
Using prebuilt Docker images
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Our `Docker images <https://hub.docker.com/r/telecominfraproject/oopt-gnpy>`_ contain everything needed to run all examples from this guide.
Docker transparently fetches the image over the network upon first use.
On Linux and Mac, run:
.. code-block:: shell-session
$ docker run -it --rm --volume $(pwd):/shared telecominfraproject/oopt-gnpy
root@bea050f186f7:/shared/example-data#
On Windows, launch from Powershell as:
.. code-block:: console
PS C:\> docker run -it --rm --volume ${PWD}:/shared telecominfraproject/oopt-gnpy
root@89784e577d44:/shared/example-data#
In both cases, a directory named ``example-data/`` will appear in your current working directory.
GNPy automatically populates it with example files from the current release.
Remove that directory if you want to start from scratch.
.. _install-python:
Using Python on your computer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
**Note**: `gnpy` supports Python 3 only. Python 2 is not supported.
`gnpy` requires Python ≥3.6
**Note**: the `gnpy` maintainers strongly recommend the use of Anaconda for
managing dependencies.
It is recommended that you use a "virtual environment" when installing `gnpy`.
Do not install `gnpy` on your system Python.
.. _install-anaconda:
We recommend the use of the `Anaconda Python distribution <https://www.anaconda.com/download>`_ which comes with many scientific computing
dependencies pre-installed. Anaconda creates a base "virtual environment" for
you automatically. You can also create and manage your ``conda`` "virtual
environments" yourself (see:
https://conda.io/docs/user-guide/tasks/manage-environments.html)
To activate your Anaconda virtual environment, you may need to do the
following:
.. code-block:: shell-session
$ source /path/to/anaconda/bin/activate # activate Anaconda base environment
(base) $ # note the change to the prompt
You can check which Anaconda environment you are using with:
.. code-block:: shell-session
(base) $ conda env list # list all environments
# conda environments:
#
base * /src/install/anaconda3
(base) $ echo $CONDA_DEFAULT_ENV # show default environment
base
You can check your version of Python with the following. If you are using
Anaconda's Python 3, you should see similar output as below. Your results may
be slightly different depending on your Anaconda installation path and the
exact version of Python you are using.
.. code-block:: shell-session
$ which python # check which Python executable is used
/path/to/anaconda/bin/python
$ python -V # check your Python version
Python 3.6.5 :: Anaconda, Inc.
.. _install-pip:
Installing the Python package
*****************************
From within your Anaconda Python 3 environment, you can clone the master branch
of the `gnpy` repo and install it with:
.. code-block:: shell-session
$ git clone https://github.com/Telecominfraproject/oopt-gnpy # clone the repo
$ cd oopt-gnpy
$ python setup.py develop
To test that `gnpy` was successfully installed, you can run this command. If it
executes without a ``ModuleNotFoundError``, you have successfully installed
`gnpy`.
.. code-block:: shell-session
$ python -c 'import gnpy' # attempt to import gnpy
$ pytest # run tests

docs/json.rst Normal file

@@ -0,0 +1,339 @@
JSON Input Files
================
GNPy uses a set of JSON files for modeling the network.
Some data (such as network topology or the service requests) can be also passed via :ref:`XLS files<excel-service-sheet>`.
Equipment Library
-----------------
Design and transmission parameters are defined in a dedicated json file. By
default, this information is read from `gnpy/example-data/eqpt_config.json
<gnpy/example-data/eqpt_config.json>`_. This file defines the equipment libraries that
can be customized (EDFAs, fibers, and transceivers).
It also defines the simulation parameters (spans, ROADMs, and the spectral
information to transmit.)
EDFA
~~~~
The EDFA equipment library is a list of supported amplifiers. New amplifiers
can be added and existing ones removed. Three different noise models are available:
1. ``'type_def': 'variable_gain'`` is a simplified model simulating a 2-coil EDFA with internal, input and output VOAs. The NF vs gain response is calculated accordingly based on the input parameters: ``nf_min``, ``nf_max``, and ``gain_flatmax``. It is not a simple interpolation but a 2-stage NF calculation.
2. ``'type_def': 'fixed_gain'`` is a fixed gain model. `NF == Cte == nf0` if `gain_min < gain < gain_flatmax`
3. ``'type_def': None`` is an advanced model. A detailed JSON configuration file is required (by default `gnpy/example-data/std_medium_gain_advanced_config.json <gnpy/example-data/std_medium_gain_advanced_config.json>`_). It uses a 3rd order polynomial where NF = f(gain), NF_ripple = f(frequency), gain_ripple = f(frequency), N-array dgt = f(frequency). Compared to the previous models, NF ripple and gain ripple are modelled.
For all amplifier models:
+------------------------+-----------+-----------------------------------------+
| field | type | description |
+========================+===========+=========================================+
| ``type_variety`` | (string) | a unique name to ID the amplifier in the|
| | | JSON/Excel template topology input file |
+------------------------+-----------+-----------------------------------------+
| ``out_voa_auto`` | (boolean) | auto_design feature to optimize the |
| | | amplifier output VOA. If true, output |
| | | VOA is present and will be used to push |
| | | amplifier gain to its maximum, within |
| | | EOL power margins. |
+------------------------+-----------+-----------------------------------------+
| ``allowed_for_design`` | (boolean) | If false, the amplifier will not be |
| | | picked by auto-design but it can still |
| | | be used as a manual input (from JSON or |
| | | Excel template topology files.) |
+------------------------+-----------+-----------------------------------------+
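For illustration, a ``variable_gain`` EDFA entry could look like the following (written here as a Python dict mirroring the JSON structure; the field names follow the descriptions above and the numbers are made up, not a real datasheet):

.. code-block:: python

    # Illustrative equipment-library entry for a variable-gain EDFA.
    edfa_entry = {
        'type_variety': 'my_medium_gain',  # unique ID referenced from the topology
        'type_def': 'variable_gain',       # simplified 2-coil noise model
        'gain_flatmax': 25,                # dB
        'gain_min': 15,                    # dB
        'nf_min': 6,                       # dB
        'nf_max': 10,                      # dB
        'out_voa_auto': False,
        'allowed_for_design': True,
    }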
Fiber
~~~~~
The fiber library currently describes SSMF and NZDF but additional fiber types can be entered by the user following the same model:
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``type_variety`` | (string) | a unique name to ID the fiber in the |
| | | JSON or Excel template topology input |
| | | file |
+----------------------+-----------+-----------------------------------------+
| ``dispersion`` | (number) | (s.m-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``dispersion_slope`` | (number) | (s.m-1.m-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``gamma`` | (number) | 2pi.n2/(lambda*Aeff) (w-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``pmd_coef`` | (number) | Polarization mode dispersion (PMD) |
| | | coefficient. (s.sqrt(m)-1) |
+----------------------+-----------+-----------------------------------------+
Transceiver
~~~~~~~~~~~
The transceiver equipment library is a list of supported transceivers. New
transceivers can be added and existing ones removed at will by the user. It is
used to determine the service list path feasibility when running the
`path_request_run.py routine <gnpy/example-data/path_request_run.py>`_.
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``type_variety`` | (string) | A unique name to ID the transceiver in |
| | | the JSON or Excel template topology |
| | | input file |
+----------------------+-----------+-----------------------------------------+
| ``frequency`` | (number) | Min/max as below. |
+----------------------+-----------+-----------------------------------------+
| ``mode`` | (number) | A list of modes supported by the |
| | | transponder. New modes can be added at |
| | | will by the user. The modes are specific|
| | | to each transponder type_variety. |
| | | Each mode is described as below. |
+----------------------+-----------+-----------------------------------------+
The modes are defined as follows:
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``format`` | (string) | a unique name to ID the mode |
+----------------------+-----------+-----------------------------------------+
| ``baud_rate`` | (number) | in Hz |
+----------------------+-----------+-----------------------------------------+
| ``OSNR`` | (number) | min required OSNR in 0.1nm (dB) |
+----------------------+-----------+-----------------------------------------+
| ``bit_rate`` | (number) | in bit/s |
+----------------------+-----------+-----------------------------------------+
| ``roll_off`` | (number) | Pure number between 0 and 1. TX signal |
| | | roll-off shape. Used by Raman-aware |
| | | simulation code. |
+----------------------+-----------+-----------------------------------------+
| ``tx_osnr`` | (number) | In dB. OSNR out from transponder. |
+----------------------+-----------+-----------------------------------------+
| ``cost`` | (number) | Arbitrary unit |
+----------------------+-----------+-----------------------------------------+
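For illustration, one mode entry could look like this (a Python dict mirroring the JSON structure; the field names follow the table above, the numbers are made up):

.. code-block:: python

    # Illustrative transceiver mode entry.
    mode = {
        'format': 'my mode 1',  # unique name of the mode
        'baud_rate': 32e9,      # Hz
        'OSNR': 11,             # dB, min required OSNR in 0.1 nm
        'bit_rate': 100e9,      # bit/s
        'roll_off': 0.15,       # TX roll-off, between 0 and 1
        'tx_osnr': 40,          # dB, OSNR out from the transponder
        'cost': 1,              # arbitrary unit
    }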
Simulation parameters
~~~~~~~~~~~~~~~~~~~~~
Auto-design automatically creates EDFA amplifier network elements when they are
missing, after a fiber, or between a ROADM and a fiber. This auto-design
functionality can be manually and locally deactivated by introducing a ``Fused``
network element after a ``Fiber`` or a ``Roadm`` that doesn't need amplification.
The amplifier is chosen in the EDFA list of the equipment library based on
gain, power, and NF criteria. Only the EDFA that are marked
``'allowed_for_design': true`` are considered.
For amplifiers defined in the topology JSON input but whose ``gain = 0``
(placeholder), auto-design will set its gain automatically: see ``power_mode`` in
the ``Spans`` library to find out how the gain is calculated.
Span
~~~~
Span configuration is not a list (which may change
in later releases) and the user can only modify the value of existing
parameters:
+-------------------------------------+-----------+---------------------------------------------+
| field | type | description |
+=====================================+===========+=============================================+
| ``power_mode`` | (boolean) | If false, gain mode. Auto-design sets |
| | | amplifier gain = preceding span loss, |
| | | unless the amplifier exists and its |
| | | gain > 0 in the topology input JSON. |
| | | If true, power mode (recommended for |
| | | auto-design and power sweep.) |
| | | Auto-design sets amplifier power |
| | | according to delta_power_range. If the |
| | | amplifier exists with gain > 0 in the |
| | | topology JSON input, then its gain is |
| | | translated into a power target/channel. |
| | | Moreover, when performing a power sweep |
| | | (see ``power_range_db`` in the SI |
| | | configuration library) the power sweep |
| | | is performed w/r/t this power target, |
| | | regardless of preceding amplifiers |
| | | power saturation/limitations. |
+-------------------------------------+-----------+---------------------------------------------+
| ``delta_power_range_db`` | (number) | Auto-design only, power-mode |
| | | only. Specifies the [min, max, step] |
| | | power excursion/span. It is a relative |
| | | power excursion w/r/t the |
| | | power_dbm + power_range_db |
| | | (power sweep if applicable) defined in |
| | | the SI configuration library. This |
| | | relative power excursion is = 1/3 of |
| | | the span loss difference with the |
| | | reference 20 dB span. The 1/3 slope is |
| | | derived from the GN model equations. |
| | | For example, a 23 dB span loss will be |
| | | set to 1 dB more power than a 20 dB |
| | | span loss. The 20 dB reference spans |
| | | will *always* be set to |
| | | power = power_dbm + power_range_db. |
| | | To configure the same power in all |
| | | spans, use `[0, 0, 0]`. All spans will |
| | | be set to |
| | | power = power_dbm + power_range_db. |
| | | To configure the same power in all spans |
| | | and 3 dB more power just for the longest |
| | | spans: `[0, 3, 3]`. The longest spans are |
| | | set to |
| | | power = power_dbm + power_range_db + 3. |
| | | To configure a 4 dB power range across |
| | | all spans in 0.5 dB steps: `[-2, 2, 0.5]`. |
| | | A 17 dB span is set to |
| | | power = power_dbm + power_range_db - 1, |
| | | a 20 dB span to |
| | | power = power_dbm + power_range_db and |
| | | a 23 dB span to |
| | | power = power_dbm + power_range_db + 1 |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_fiber_lineic_loss_for_raman`` | (number) | Maximum linear fiber loss for Raman |
| | | amplification use. |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_length`` | (number) | Split fiber lengths > max_length. |
| | | Interest to support high level |
| | | topologies that do not specify in line |
| | | amplification sites. For example the |
| | | CORONET_Global_Topology.xlsx defines |
| | | links > 1000km between 2 sites: it |
| | | couldn't be simulated if these links |
| | | were not split in shorter span lengths. |
+-------------------------------------+-----------+---------------------------------------------+
| ``length_unit`` | "m"/"km" | Unit for ``max_length``. |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_loss`` | (number) | Not used in the current code |
| | | implementation. |
+-------------------------------------+-----------+---------------------------------------------+
| ``padding`` | (number) | In dB. Min span loss before putting an |
| | | attenuator before fiber. Attenuator |
| | | value |
| | | Fiber.att_in = max(0, padding - span_loss). |
| | | Padding can be set manually to reach a |
| | | higher padding value for a given fiber |
| | | by filling in the Fiber/params/att_in |
| | | field in the topology json input [1] |
| | | but if span_loss = length * loss_coef |
| | | + att_in + con_in + con_out < padding, |
| | | the specified att_in value will be |
| | | completed to have span_loss = padding. |
| | | Therefore it is not possible to set |
| | | span_loss < padding. |
+-------------------------------------+-----------+---------------------------------------------+
| ``EOL`` | (number) | All fiber span loss ageing. The value |
| | | is added to the con_out (fiber output |
| | | connector). So the design and the path |
| | | feasibility are performed with |
| | | span_loss + EOL. EOL cannot be set |
| | | manually for a given fiber span |
| | | (workaround is to specify higher |
| | | ``con_out`` loss for this fiber). |
+-------------------------------------+-----------+---------------------------------------------+
| ``con_in``, | (number) | Default values if Fiber/params/con_in/out |
| ``con_out`` | | is None in the topology input |
| | | description. This default value is |
| | | ignored if a Fiber/params/con_in/out |
| | | value is input in the topology for a |
| | | given Fiber. |
+-------------------------------------+-----------+---------------------------------------------+
.. code-block:: json
{
"uid": "fiber (A1->A2)",
"type": "Fiber",
"type_variety": "SSMF",
"params":
{
"length": 120.0,
"loss_coef": 0.2,
"length_units": "km",
"att_in": 0,
"con_in": 0,
"con_out": 0
}
}
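The ``delta_power_range_db`` behaviour described above (one third of the span-loss difference w/r/t the 20 dB reference span, clamped to the configured range and snapped to the step) can be sketched as follows (an illustration only, not GNPy's code):

.. code-block:: python

    # Sketch of the relative power excursion per span for auto-design in
    # power mode; the result is added to power_dbm + power_range_db.
    def span_power_excursion(span_loss_db, delta_power_range_db=(-2.0, 2.0, 0.5)):
        low, high, step = delta_power_range_db
        excursion = (span_loss_db - 20.0) / 3.0         # 1/3 slope from the GN model
        excursion = max(low, min(high, excursion))      # clamp to [min, max]
        if step > 0:
            excursion = round(excursion / step) * step  # snap to the step size
        return excursion

    # a 23 dB span gets +1 dB w/r/t the 20 dB reference span
    print(span_power_excursion(23.0))  # -> 1.0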
ROADM
~~~~~
The user can only modify the value of existing parameters:
+--------------------------+-----------+---------------------------------------------+
| field | type | description |
+==========================+===========+=============================================+
| ``target_pch_out_db`` | (number) | Auto-design sets the ROADM egress channel |
| | | power. This reflects typical control loop |
| | | algorithms that adjust ROADM losses to |
| | | equalize channels (eg coming from different |
| | | ingress direction or add ports) |
| | | This is the default value |
| | | Roadm/params/target_pch_out_db if no value |
| | | is given in the ``Roadm`` element in the |
| | | topology input description. |
| | | This default value is ignored if a |
| | | params/target_pch_out_db value is input in |
| | | the topology for a given ROADM. |
+--------------------------+-----------+---------------------------------------------+
| ``add_drop_osnr`` | (number) | OSNR contribution from the add/drop ports |
+--------------------------+-----------+---------------------------------------------+
| ``pmd`` | (number) | Polarization mode dispersion (PMD). (s) |
+--------------------------+-----------+---------------------------------------------+
| ``restrictions`` | (dict of | If non-empty, keys ``preamp_variety_list`` |
| | strings) | and ``booster_variety_list`` represent |
| | | list of ``type_variety`` amplifiers which |
| | | are allowed for auto-design within ROADM's |
| | | line degrees. |
| | | |
| | | If no booster should be placed on a degree, |
| | | insert a ``Fused`` node on the degree |
| | | output. |
+--------------------------+-----------+---------------------------------------------+
SpectralInformation
~~~~~~~~~~~~~~~~~~~
The user can only modify the value of existing parameters. It defines a spectrum of N
identical carriers. While the code libraries allow for different carriers and
power levels, the current user parametrization only allows one carrier type and
one power/channel definition.
+----------------------+-----------+-------------------------------------------+
| field | type | description |
+======================+===========+===========================================+
| ``f_min``, | (number) | In Hz. Carrier min max excursion. |
| ``f_max`` | | |
+----------------------+-----------+-------------------------------------------+
| ``baud_rate`` | (number) | In Hz. Simulated baud rate. |
+----------------------+-----------+-------------------------------------------+
| ``spacing`` | (number) | In Hz. Carrier spacing. |
+----------------------+-----------+-------------------------------------------+
| ``roll_off`` | (number) | Pure number between 0 and 1. TX signal |
| | | roll-off shape. Used by Raman-aware |
| | | simulation code. |
+----------------------+-----------+-------------------------------------------+
| ``tx_osnr`` | (number) | In dB. OSNR out from transponder. |
+----------------------+-----------+-------------------------------------------+
| ``power_dbm`` | (number) | Reference channel power. In gain mode |
| | | (see spans/power_mode = false), all gain |
| | | settings are offset w/r/t this reference |
| | | power. In power mode, it is the |
| | | reference power for |
| | | Spans/delta_power_range_db. For example, |
| | | if delta_power_range_db = `[0,0,0]`, the |
| | | same power = power_dbm is launched into |
| | | every span. The network design is performed |
| | | with the power_dbm value: even if a |
| | | power sweep is defined (see below), the |
| | | design is not repeated. |
+----------------------+-----------+-------------------------------------------+
| ``power_range_db`` | (number) | Power sweep excursion around power_dbm. |
| | | It is not the min and max channel power |
| | | values! The reference power becomes: |
| | | power_range_db + power_dbm. |
+----------------------+-----------+-------------------------------------------+
| ``sys_margins`` | (number) | In dB. Added margin on min required |
| | | transceiver OSNR. |
+----------------------+-----------+-------------------------------------------+
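
To tie these fields together, here is a sketch of one entry of the equipment
library's ``SI`` section; the section name and every numeric value below are
illustrative assumptions, not recommended settings:

.. code-block:: json

    {
        "f_min": 191.35e12,
        "f_max": 196.1e12,
        "baud_rate": 32e9,
        "spacing": 50e9,
        "roll_off": 0.15,
        "tx_osnr": 40,
        "power_dbm": 0,
        "power_range_db": 0,
        "sys_margins": 2
    }

With ``power_range_db`` left at 0, no power sweep is performed and both the
design and the propagation use ``power_dbm`` as the per-channel reference
power.
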


@@ -1,18 +1,18 @@
Physical Model used in GNPy
===========================
QoT-E including ASE noise and NLI accumulation
----------------------------------------------
The operations of the PSE simulative framework are based on the capability to
estimate the QoT of one or more channels operating lightpaths over a given
network route. For backbone transport networks, we can suppose that
transceivers are operating polarization-division-multiplexed multilevel
modulation formats with DSP-based coherent receivers, including equalization.
For the optical links, we focus on state-of-the-art amplified and uncompensated
fiber links, connecting network nodes including ROADMs, where add and drop
operations on data traffic are performed. In such a transmission scenario, it
is well accepted
:cite:`vacondio_nonlinear_2012,bononi_modeling_2012,carena_modeling_2012,mecozzi_nonlinear_2012,secondini_analytical_2012,johannisson_perturbation_2013,dar_properties_2013,serena_alternative_2013,secondini_achievable_2013,poggiolini_gn-model_2014,dar_accumulation_2014,poggiolini_analytical_2011,savory_approximations_2013,bononi_single-_2013,johannisson_modeling_2014`
to assume that transmission performance is limited by the amplified
spontaneous emission (ASE) noise generated by optical amplifiers and
@@ -49,7 +49,6 @@ filtering effects. Note that for state-of-the-art equipment, filtering
effects can typically be neglected over routes with few hops
:cite:`rahman_mitigation_2014,foggi_overcoming_2015`.
To properly estimate :math:`P_{\text{ch}}` and :math:`P_{\text{ASE}}`
the transmitted power at the beginning of the considered route must be
known, together with the losses and the amplifiers' gain and noise figure, including their
@@ -62,8 +61,10 @@ models have been proposed and validated in the technical literature
The decision about which model to test within the PSE activities was
driven by requirements of the entire PSE framework:
i. the model must be *local*, i.e., related individually to each network element (i.e. fiber span) generating NLI, independently of preceding and subsequent elements; and
ii. the related computational time must be compatible with interactive operations.
So, the choice fell on the Gaussian Noise
(GN) model with incoherent accumulation of NLI over fiber spans
@@ -79,46 +80,67 @@ for fiber types with chromatic dispersion roughly larger than 4
ps/nm/km, the analytical approximation ensures an excellent accuracy
with a computational time compatible with real-time operations.
The Gaussian Noise Model to evaluate the NLI
--------------------------------------------
As previously stated, fiber propagation of multilevel modulation formats
relying on the polarization-division-multiplexing generates impairments that
can be summarized as a disturbance called nonlinear interference (NLI), when
exploiting a DSP-based coherent receiver, as in all state-of-the-art equipment.
From a practical point of view, the NLI can be modeled as an additive Gaussian
random process added by each fiber span, and whose strength depends on the cube
of the input power spectral density and on the fiber-span parameters.
Since the introduction to the market in 2007 of the first transponder based on
such a transmission technique, the scientific community has worked intensively
to define its propagation behavior. First,
the role of in-line chromatic dispersion compensation has been investigated,
deducing that, besides not being essential, it is indeed detrimental to
performance :cite:`curri_dispersion_2008`. Then, it has been observed that
the fiber propagation impairments are practically summarized by the sole NLI,
being all the other phenomena compensated for by the blind equalizer
implemented in the receiver DSP :cite:`carena_statistical_2010`. Once these
assessments have been accepted by the community, several prestigious research
groups have started to work on deriving analytical models able to estimate
the NLI accumulation, and consequently the generalized SNR that sets the
BER, according to the transponder BER vs. SNR performance. Many models
delivering different levels of accuracy have been developed and validated. As
previously clarified, for the purposes of the PSE framework, the GN-model with
incoherent accumulation of NLI over fiber spans has been selected as adequate.
The first reason for such a choice is that it is a "local" model, i.e.
related to each fiber span independently of the preceding and succeeding
network elements. The other model characteristic driving the choice is the
availability of a closed form for the model, so permitting a real-time
evaluation, as required by the PSE framework. For a detailed derivation of the
model, please refer to :cite:`poggiolini_analytical_2011`; a qualitative
description follows. The GN-model assumes that
the channel comb propagating in the fiber is well approximated by unpolarized
spectrally shaped Gaussian noise. In such a scenario, supposing to rely - as in
state-of-the-art equipment - on a receiver entirely compensating for linear
propagation effects, propagation in the fiber only excites the four-wave mixing
(FWM) process among the continuity of the tones occupying the bandwidth. Such a
FWM generates an unpolarized complex Gaussian disturbance in each spectral slot
that can be easily evaluated extending the FWM theory from a set of discrete
tones - the standard FWM theory introduced back in the 90s by Inoue
:cite:`Innoue-FWM` - to a continuity of tones, possibly spectrally shaped.
Signals propagating in the fiber are not equivalent to Gaussian noise, but
thanks to the absence of in-line compensation for chromatic dispersion, they
become so over short distances. So, the Gaussian noise model with incoherent
accumulation of NLI has extensively proved to be a quick yet accurate and
conservative tool to estimate fiber propagation impairments.
Note that the GN-model has not been derived with the aim of an *exact*
performance estimation, but to pursue a conservative performance prediction.
So, considering these characteristics, and the fact that the NLI is always a
secondary effect with respect to the ASE noise accumulation, and - most
importantly - that typically linear propagation parameters (losses, gains and
noise figures) are known within a variation range, a QoT estimator based on the
GN model is adequate to deliver performance predictions in terms of a
reasonable SNR range, rather than an exact value. As a final remark, it must be
clarified that the GN-model is adequate to be used when relying on a relatively
narrow bandwidth of up to a few THz. When exceeding such a bandwidth occupation,
the GN-model must be generalized by introducing the interaction with Stimulated
Raman Scattering in order to give a proper estimation for all channels
:cite:`cantono2018modeling`. This will be the main upgrade required within the
PSE framework.
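
As a compact restatement of the above (a sketch using only quantities already
introduced in this section, where :math:`P_{\text{NLI}}` is the accumulated
nonlinear interference power and :math:`G_{\text{WDM}}` the input WDM power
spectral density of a span):

.. math::

   \mathrm{SNR} = \frac{P_{\text{ch}}}{P_{\text{ASE}} + P_{\text{NLI}}},
   \qquad
   P_{\text{NLI}} = \sum_{s=1}^{N_{\text{spans}}} P_{\text{NLI},s},
   \qquad
   P_{\text{NLI},s} \propto G_{\text{WDM}}^{3} .

The middle relation expresses the incoherent, span-by-span accumulation of NLI
motivated in the text, and the cubic dependence on the input power spectral
density is the one stated at the beginning of this section.
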
.. bibliography:: biblio.bib


@@ -1,70 +0,0 @@
gnpy\.core package
==================

Submodules
----------

gnpy\.core\.elements module
---------------------------

.. automodule:: gnpy.core.elements
    :members:
    :undoc-members:
    :show-inheritance:

gnpy\.core\.execute module
--------------------------

.. automodule:: gnpy.core.execute
    :members:
    :undoc-members:
    :show-inheritance:

gnpy\.core\.info module
-----------------------

.. automodule:: gnpy.core.info
    :members:
    :undoc-members:
    :show-inheritance:

gnpy\.core\.network module
--------------------------

.. automodule:: gnpy.core.network
    :members:
    :undoc-members:
    :show-inheritance:

gnpy\.core\.node module
-----------------------

.. automodule:: gnpy.core.node
    :members:
    :undoc-members:
    :show-inheritance:

gnpy\.core\.units module
------------------------

.. automodule:: gnpy.core.units
    :members:
    :undoc-members:
    :show-inheritance:

gnpy\.core\.utils module
------------------------

.. automodule:: gnpy.core.utils
    :members:
    :undoc-members:
    :show-inheritance:

Module contents
---------------

.. automodule:: gnpy.core
    :members:
    :undoc-members:
    :show-inheritance:


@@ -1,17 +0,0 @@
gnpy package
============

Subpackages
-----------

.. toctree::

    gnpy.core

Module contents
---------------

.. automodule:: gnpy
    :members:
    :undoc-members:
    :show-inheritance:


@@ -1,7 +0,0 @@
gnpy
====

.. toctree::
    :maxdepth: 4

    gnpy

Binary file not shown.


@@ -1,11 +0,0 @@
REGIONS = europe asia conus
TARGETS = $(foreach region,$(REGIONS),coronet.$(region).json)

all: $(TARGETS)

$(TARGETS): convert.py CORONET_Global_Topology.xls
	python $< -f $(subst .json,,$(subst coronet.,,$@)) > $@

.PHONY: clean
clean:
	-rm $(TARGETS) -f


@@ -1,143 +0,0 @@
{
"networks": {
"network": [
{
"network-types": {
"tip-oopt-pse": {}
},
"network-id": "pt-to-pt",
"node": [
{
"node-id": "M_KMA",
"type":"roadm",
"termination-point": [
{
"tp-id": "1-2-1"
}
]
},
{
"node-id": "T_CAS",
"type":"roadm",
"termination-point": [
{
"tp-id": "2-1-1"
},
{
"tp-id": "2-3-1"
}
]
},
{
"node-id": "LA",
"type":"ila",
"termination-point": [
{
"tp-id": "3-2-1"
},
{
"tp-id": "3-4-1"
}
]
},
{
"node-id": "SR",
"type":"fused",
"termination-point": [
{
"tp-id": "4-3-1"
}
]
},
{
"node-id": "C",
"type":"ila",
"termination-point": [
{
"tp-id": "5-6-1"
}
]
},
{
"node-id": "N_KBE",
"type":"roadm",
"termination-point": [
{
"tp-id": "6-5-1"
},
{
"tp-id": "6-7-1"
}
]
},
{
"node-id": "N_KBA",
"type":"roadm",
"termination-point": [
{
"tp-id": "7-6-1"
}
]
}
],
"link": [
{
"link-id": "M_KMA,1-2-1,T_CAS,2-1-1",
"source": {
"source-node": "M_KMA",
"source-tp": "1-2-1"
},
"destination": {
"dest-node": "T_CAS",
"dest-tp": "2-1-1"
}
},
{
"link-id": "T_CAS,2-3-1,LA,3-2-1",
"source": {
"source-node": "T_CAS",
"source-tp": "2-3-1"
},
"destination": {
"dest-node": "LA",
"dest-tp": "3-2-1"
}
},
{
"link-id": "LA,3-4-1,SR,4-3-1",
"source": {
"source-node": "LA",
"source-tp": "3-4-1"
},
"destination": {
"dest-node": "SR",
"dest-tp": "4-3-1"
}
},
{
"link-id": "C,5-6-1,N_KBE,6-5-1",
"source": {
"source-node": "C",
"source-tp": "5-6-1"
},
"destination": {
"dest-node": "N_KBE",
"dest-tp": "6-5-1"
}
},
{
"link-id": "N_KBE,6-7-1,N_KBA,7-6-1",
"source": {
"source-node": "N_KBE",
"source-tp": "6-7-1"
},
"destination": {
"dest-node": "N_KBA",
"dest-tp": "7-6-1"
}
}
]
}
]
}
}


@@ -1,157 +0,0 @@
{
"network_name": "pt to pt",
"nodes_elements":
[
{
"id":"M_KMA",
"type":"ROADM",
"metadata": {
"city":"M",
"region":"RLD",
"latitude":0,
"longitude":0
}
},
{
"id":"T_CAS",
"type":"ROADM",
"metadata": {
"city":"T",
"region":"RLD",
"latitude":0,
"longitude":0
}
},
{
"id":"LA",
"type":"ILA",
"metadata": {
"city":"LA",
"region":"RLD",
"latitude":0,
"longitude":0
}
},
{
"id":"SR",
"type":"fused",
"metadata": {
"city":"SR",
"region":"RLD",
"latitude":0,
"longitude":0
}
},
{
"id":"C",
"type":"ILA",
"metadata": {
"city":"C",
"region":"RLD",
"latitude":0,
"longitude":0
}
},
{
"id":"N_KBE",
"type":"ROADM",
"metadata": {
"city":"N",
"region":"RLD",
"latitude":0,
"longitude":0
}
},
{
"id":"N_KBA",
"type":"ROADM",
"metadata": {
"city":"N",
"region":"RLD",
"latitude":0,
"longitude":0
}
}
],
"OTS_elements":[
{
"id":1,
"source_id":"M_KMA",
"dest_id":"T_CAS",
"parameters_cable":{
"units":"km",
"length":60,
"id":"F060",
"type":"G652"
},
"parameters_east":{
"con_in":0.5,
"con_out":0.5,
"loss":16,
"pmd":2,
"fo_id":5
},
"parameters_west":{
"con_in":0.5,
"con_out":0.5,
"loss":15,
"pmd":2,
"fo_id":6
}
},
{
"id":2,
"source_id":"T_CAS",
"dest_id":"LA",
"parameters_cable":{
},
"parameters_east":{
},
"parameters_west":{
}
},
{
"id":3,
"source_id":"LA",
"dest_id":"SR",
"parameters_cable":{
},
"parameters_east":{
},
"parameters_west":{
}
},
{
"id":3,
"source_id":"SR",
"dest_id":"C",
"parameters_cable":{
},
"parameters_east":{
},
"parameters_west":{
}
},
{
"id":5,
"source_id":"C",
"dest_id":"N_KBE",
"parameters_cable":{
},
"parameters_east":{
},
"parameters_west":{
}
},
{
"id":6,
"source_id":"N_KBE",
"dest_id":"N_KBA",
"parameters_cable":{
},
"parameters_east":{
},
"parameters_west":{
}
}
]}


@@ -1,162 +0,0 @@
#!/usr/bin/env python3
from sys import exit
try:
from xlrd import open_workbook
except ModuleNotFoundError:
exit('Required: `pip install xlrd`')
from argparse import ArgumentParser
from collections import namedtuple, Counter
from itertools import chain
from json import dumps
from uuid import uuid4
import math
import numpy as np
output_json_file_name = 'coronet_conus_example.json'
Node = namedtuple('Node', 'city state country region latitude longitude')
class Link(namedtuple('Link', 'from_city to_city distance distance_units')):
def __new__(cls, from_city, to_city, distance, distance_units='km'):
return super().__new__(cls, from_city, to_city, distance, distance_units)
def define_span_range(min_span, max_span, nspans):
srange = min_span + (max_span - min_span) * np.random.rand(nspans)
return srange
def amp_spacings(min_span,max_span,length):
nspans = math.ceil(length/100)
spans = define_span_range(min_span, max_span, nspans)
tot = spans.sum()
delta = length -tot
if delta > 0 and delta < 25:
ind = np.argmin(spans)
spans[ind] = spans[ind] + delta
elif delta >= 25 and delta < 40:
spans = spans + delta/float(nspans)
elif delta >= 40 and delta < 100:
spans = np.append(spans, delta)
elif delta >= 100:
spans = np.append(spans, [delta/2, delta/2])
elif delta < 0:
spans = spans + delta/float(nspans)
return list(spans)
def parse_excel(args):
with open_workbook(args.workbook) as wb:
nodes_sheet = wb.sheet_by_name('Nodes')
links_sheet = wb.sheet_by_name('Links')
# sanity check
header = [x.value.strip() for x in nodes_sheet.row(4)]
expected = ['City', 'State', 'Country', 'Region', 'Latitude', 'Longitude']
if header != expected:
raise ValueError(f'Malformed header on Nodes sheet: {header} != {expected}')
nodes = []
for row in all_rows(nodes_sheet, start=5):
nodes.append(Node(*(x.value for x in row)))
# sanity check
header = [x.value.strip() for x in links_sheet.row(4)]
expected = ['Node A', 'Node Z', 'Distance (km)']
if header != expected:
raise ValueError(f'Malformed header on Links sheet: {header} != {expected}')
links = []
for row in all_rows(links_sheet, start=5):
links.append(Link(*(x.value for x in row)))
# sanity check
all_cities = Counter(n.city for n in nodes)
if len(all_cities) != len(nodes):
raise ValueError(f'Duplicate city: {all_cities}')
if any(ln.from_city not in all_cities or
ln.to_city not in all_cities for ln in links):
raise ValueError('Bad link.')
return nodes, links
parser = ArgumentParser()
parser.add_argument('workbook', nargs='?', default='CORONET_Global_Topology.xls')
parser.add_argument('-f', '--filter-region', action='append', default=[])
all_rows = lambda sh, start=0: (sh.row(x) for x in range(start, sh.nrows))
def midpoint(city_a, city_b):
lats = city_a.latitude, city_b.latitude
longs = city_a.longitude, city_b.longitude
return {
'latitude': sum(lats) / 2,
'longitude': sum(longs) / 2,
}
if __name__ == '__main__':
args = parser.parse_args()
nodes, links = parse_excel(args)
if args.filter_region:
nodes = [n for n in nodes if n.region.lower() in args.filter_region]
cities = {n.city for n in nodes}
links = [lnk for lnk in links if lnk.from_city in cities and
lnk.to_city in cities]
cities = {lnk.from_city for lnk in links} | {lnk.to_city for lnk in links}
nodes = [n for n in nodes if n.city in cities]
nodes_by_city = {n.city: n for n in nodes}
data = {
'elements':
[{'uid': f'trx {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Transceiver'}
for x in nodes] +
[{'uid': f'roadm {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes] +
[{'uid': f'fiber ({x.from_city} → {x.to_city})',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'params': {'length': round(x.distance, 3),
'length_units': x.distance_units,
'loss_coef': 0.2,
'dispersion': 16.7E-6,
'gamma': 1.27E-3}
}
for x in links],
'connections':
list(chain.from_iterable(zip( # put bidi next to each other
[{'from_node': f'roadm {x.from_city}',
'to_node': f'fiber ({x.from_city} → {x.to_city})'}
for x in links],
[{'from_node': f'fiber ({x.from_city} → {x.to_city})',
'to_node': f'roadm {x.from_city}'}
for x in links])))
+
list(chain.from_iterable(zip(
[{'from_node': f'fiber ({x.from_city} → {x.to_city})',
'to_node': f'roadm {x.to_city}'}
for x in links],
[{'from_node': f'roadm {x.to_city}',
'to_node': f'fiber ({x.from_city} → {x.to_city})'}
for x in links])))
+
list(chain.from_iterable(zip(
[{'from_node': f'trx {x.city}',
'to_node': f'roadm {x.city}'}
for x in nodes],
[{'from_node': f'roadm {x.city}',
'to_node': f'trx {x.city}'}
for x in nodes])))
}
print(dumps(data, indent=2))
with open(output_json_file_name,'w') as edfa_json_file:
edfa_json_file.write(dumps(data, indent=2))


@@ -1,542 +0,0 @@
{
"elements": [
{
"uid": "Bangkok",
"metadata": {
"location": {
"city": "Bangkok",
"region": "Asia",
"latitude": 13.73,
"longitude": 100.5
}
},
"type": "Transceiver"
},
{
"uid": "Beijing",
"metadata": {
"location": {
"city": "Beijing",
"region": "Asia",
"latitude": 39.92999979,
"longitude": 116.4000013
}
},
"type": "Transceiver"
},
{
"uid": "Delhi",
"metadata": {
"location": {
"city": "Delhi",
"region": "Asia",
"latitude": 28.6700003,
"longitude": 77.2099989
}
},
"type": "Transceiver"
},
{
"uid": "Hong_Kong",
"metadata": {
"location": {
"city": "Hong_Kong",
"region": "Asia",
"latitude": 22.267,
"longitude": 114.14
}
},
"type": "Transceiver"
},
{
"uid": "Honolulu",
"metadata": {
"location": {
"city": "Honolulu",
"region": "Asia",
"latitude": 21.3199996,
"longitude": -157.800003
}
},
"type": "Transceiver"
},
{
"uid": "Mumbai",
"metadata": {
"location": {
"city": "Mumbai",
"region": "Asia",
"latitude": 18.9599987,
"longitude": 72.8199999
}
},
"type": "Transceiver"
},
{
"uid": "Seoul",
"metadata": {
"location": {
"city": "Seoul",
"region": "Asia",
"latitude": 37.56000108,
"longitude": 126.9899988
}
},
"type": "Transceiver"
},
{
"uid": "Shanghai",
"metadata": {
"location": {
"city": "Shanghai",
"region": "Asia",
"latitude": 31.23,
"longitude": 121.47
}
},
"type": "Transceiver"
},
{
"uid": "Singapore",
"metadata": {
"location": {
"city": "Singapore",
"region": "Asia",
"latitude": 1.299999907,
"longitude": 103.8499992
}
},
"type": "Transceiver"
},
{
"uid": "Sydney",
"metadata": {
"location": {
"city": "Sydney",
"region": "Asia",
"latitude": -33.86999896,
"longitude": 151.2100066
}
},
"type": "Transceiver"
},
{
"uid": "Taipei",
"metadata": {
"location": {
"city": "Taipei",
"region": "Asia",
"latitude": 25.0200005,
"longitude": 121.449997
}
},
"type": "Transceiver"
},
{
"uid": "Tokyo",
"metadata": {
"location": {
"city": "Tokyo",
"region": "Asia",
"latitude": 35.6699986,
"longitude": 139.770004
}
},
"type": "Transceiver"
},
{
"uid": "fiber (Bangkok \u2192 Delhi)",
"metadata": {
"length": 3505.95,
"units": "km",
"location": {
"latitude": 21.20000015,
"longitude": 88.85499945000001
}
},
"type": "Fiber"
},
{
"uid": "fiber (Bangkok \u2192 Hong_Kong)",
"metadata": {
"length": 2070.724,
"units": "km",
"location": {
"latitude": 17.9985,
"longitude": 107.32
}
},
"type": "Fiber"
},
{
"uid": "fiber (Beijing \u2192 Seoul)",
"metadata": {
"length": 1146.124,
"units": "km",
"location": {
"latitude": 38.745000434999994,
"longitude": 121.69500005
}
},
"type": "Fiber"
},
{
"uid": "fiber (Beijing \u2192 Shanghai)",
"metadata": {
"length": 1284.465,
"units": "km",
"location": {
"latitude": 35.579999895,
"longitude": 118.93500065
}
},
"type": "Fiber"
},
{
"uid": "fiber (Delhi \u2192 Mumbai)",
"metadata": {
"length": 1402.141,
"units": "km",
"location": {
"latitude": 23.8149995,
"longitude": 75.0149994
}
},
"type": "Fiber"
},
{
"uid": "fiber (Hong_Kong \u2192 Shanghai)",
"metadata": {
"length": 1480.406,
"units": "km",
"location": {
"latitude": 26.7485,
"longitude": 117.805
}
},
"type": "Fiber"
},
{
"uid": "fiber (Hong_Kong \u2192 Sydney)",
"metadata": {
"length": 8856.6,
"units": "km",
"location": {
"latitude": -5.801499479999999,
"longitude": 132.67500330000001
}
},
"type": "Fiber"
},
{
"uid": "fiber (Hong_Kong \u2192 Taipei)",
"metadata": {
"length": 966.177,
"units": "km",
"location": {
"latitude": 23.64350025,
"longitude": 117.79499849999999
}
},
"type": "Fiber"
},
{
"uid": "fiber (Honolulu \u2192 Sydney)",
"metadata": {
"length": 9808.616,
"units": "km",
"location": {
"latitude": -6.274999679999999,
"longitude": -3.294998199999995
}
},
"type": "Fiber"
},
{
"uid": "fiber (Honolulu \u2192 Taipei)",
"metadata": {
"length": 9767.013,
"units": "km",
"location": {
"latitude": 23.17000005,
"longitude": -18.175003000000004
}
},
"type": "Fiber"
},
{
"uid": "fiber (Mumbai \u2192 Singapore)",
"metadata": {
"length": 4692.708,
"units": "km",
"location": {
"latitude": 10.1299993035,
"longitude": 88.33499954999999
}
},
"type": "Fiber"
},
{
"uid": "fiber (Seoul \u2192 Tokyo)",
"metadata": {
"length": 1391.085,
"units": "km",
"location": {
"latitude": 36.614999839999996,
"longitude": 133.3800014
}
},
"type": "Fiber"
},
{
"uid": "fiber (Singapore \u2192 Sydney)",
"metadata": {
"length": 7562.331,
"units": "km",
"location": {
"latitude": -16.2849995265,
"longitude": 127.5300029
}
},
"type": "Fiber"
},
{
"uid": "fiber (Taipei \u2192 Tokyo)",
"metadata": {
"length": 2537.345,
"units": "km",
"location": {
"latitude": 30.344999549999997,
"longitude": 130.6100005
}
},
"type": "Fiber"
}
],
"connections": [
{
"from_node": "Bangkok",
"to_node": "fiber (Bangkok \u2192 Delhi)"
},
{
"from_node": "fiber (Bangkok \u2192 Delhi)",
"to_node": "Bangkok"
},
{
"from_node": "Bangkok",
"to_node": "fiber (Bangkok \u2192 Hong_Kong)"
},
{
"from_node": "fiber (Bangkok \u2192 Hong_Kong)",
"to_node": "Bangkok"
},
{
"from_node": "Beijing",
"to_node": "fiber (Beijing \u2192 Seoul)"
},
{
"from_node": "fiber (Beijing \u2192 Seoul)",
"to_node": "Beijing"
},
{
"from_node": "Beijing",
"to_node": "fiber (Beijing \u2192 Shanghai)"
},
{
"from_node": "fiber (Beijing \u2192 Shanghai)",
"to_node": "Beijing"
},
{
"from_node": "Delhi",
"to_node": "fiber (Delhi \u2192 Mumbai)"
},
{
"from_node": "fiber (Delhi \u2192 Mumbai)",
"to_node": "Delhi"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Hong_Kong \u2192 Shanghai)"
},
{
"from_node": "fiber (Hong_Kong \u2192 Shanghai)",
"to_node": "Hong_Kong"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Hong_Kong \u2192 Sydney)"
},
{
"from_node": "fiber (Hong_Kong \u2192 Sydney)",
"to_node": "Hong_Kong"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Hong_Kong \u2192 Taipei)"
},
{
"from_node": "fiber (Hong_Kong \u2192 Taipei)",
"to_node": "Hong_Kong"
},
{
"from_node": "Honolulu",
"to_node": "fiber (Honolulu \u2192 Sydney)"
},
{
"from_node": "fiber (Honolulu \u2192 Sydney)",
"to_node": "Honolulu"
},
{
"from_node": "Honolulu",
"to_node": "fiber (Honolulu \u2192 Taipei)"
},
{
"from_node": "fiber (Honolulu \u2192 Taipei)",
"to_node": "Honolulu"
},
{
"from_node": "Mumbai",
"to_node": "fiber (Mumbai \u2192 Singapore)"
},
{
"from_node": "fiber (Mumbai \u2192 Singapore)",
"to_node": "Mumbai"
},
{
"from_node": "Seoul",
"to_node": "fiber (Seoul \u2192 Tokyo)"
},
{
"from_node": "fiber (Seoul \u2192 Tokyo)",
"to_node": "Seoul"
},
{
"from_node": "Singapore",
"to_node": "fiber (Singapore \u2192 Sydney)"
},
{
"from_node": "fiber (Singapore \u2192 Sydney)",
"to_node": "Singapore"
},
{
"from_node": "Taipei",
"to_node": "fiber (Taipei \u2192 Tokyo)"
},
{
"from_node": "fiber (Taipei \u2192 Tokyo)",
"to_node": "Taipei"
},
{
"from_node": "fiber (Bangkok \u2192 Delhi)",
"to_node": "Delhi"
},
{
"from_node": "Delhi",
"to_node": "fiber (Bangkok \u2192 Delhi)"
},
{
"from_node": "fiber (Bangkok \u2192 Hong_Kong)",
"to_node": "Hong_Kong"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Bangkok \u2192 Hong_Kong)"
},
{
"from_node": "fiber (Beijing \u2192 Seoul)",
"to_node": "Seoul"
},
{
"from_node": "Seoul",
"to_node": "fiber (Beijing \u2192 Seoul)"
},
{
"from_node": "fiber (Beijing \u2192 Shanghai)",
"to_node": "Shanghai"
},
{
"from_node": "Shanghai",
"to_node": "fiber (Beijing \u2192 Shanghai)"
},
{
"from_node": "fiber (Delhi \u2192 Mumbai)",
"to_node": "Mumbai"
},
{
"from_node": "Mumbai",
"to_node": "fiber (Delhi \u2192 Mumbai)"
},
{
"from_node": "fiber (Hong_Kong \u2192 Shanghai)",
"to_node": "Shanghai"
},
{
"from_node": "Shanghai",
"to_node": "fiber (Hong_Kong \u2192 Shanghai)"
},
{
"from_node": "fiber (Hong_Kong \u2192 Sydney)",
"to_node": "Sydney"
},
{
"from_node": "Sydney",
"to_node": "fiber (Hong_Kong \u2192 Sydney)"
},
{
"from_node": "fiber (Hong_Kong \u2192 Taipei)",
"to_node": "Taipei"
},
{
"from_node": "Taipei",
"to_node": "fiber (Hong_Kong \u2192 Taipei)"
},
{
"from_node": "fiber (Honolulu \u2192 Sydney)",
"to_node": "Sydney"
},
{
"from_node": "Sydney",
"to_node": "fiber (Honolulu \u2192 Sydney)"
},
{
"from_node": "fiber (Honolulu \u2192 Taipei)",
"to_node": "Taipei"
},
{
"from_node": "Taipei",
"to_node": "fiber (Honolulu \u2192 Taipei)"
},
{
"from_node": "fiber (Mumbai \u2192 Singapore)",
"to_node": "Singapore"
},
{
"from_node": "Singapore",
"to_node": "fiber (Mumbai \u2192 Singapore)"
},
{
"from_node": "fiber (Seoul \u2192 Tokyo)",
"to_node": "Tokyo"
},
{
"from_node": "Tokyo",
"to_node": "fiber (Seoul \u2192 Tokyo)"
},
{
"from_node": "fiber (Singapore \u2192 Sydney)",
"to_node": "Sydney"
},
{
"from_node": "Sydney",
"to_node": "fiber (Singapore \u2192 Sydney)"
},
{
"from_node": "fiber (Taipei \u2192 Tokyo)",
"to_node": "Tokyo"
},
{
"from_node": "Tokyo",
"to_node": "fiber (Taipei \u2192 Tokyo)"
}
]
}

File diff suppressed because it is too large.


@@ -1,582 +0,0 @@
{
"elements": [
{
"uid": "Amsterdam",
"metadata": {
"location": {
"city": "Amsterdam",
"region": "Europe",
"latitude": 52.3699996,
"longitude": 4.88999915
}
},
"type": "Transceiver"
},
{
"uid": "Berlin",
"metadata": {
"location": {
"city": "Berlin",
"region": "Europe",
"latitude": 52.520002,
"longitude": 13.379995
}
},
"type": "Transceiver"
},
{
"uid": "Brussels",
"metadata": {
"location": {
"city": "Brussels",
"region": "Europe",
"latitude": 50.830002,
"longitude": 4.330002
}
},
"type": "Transceiver"
},
{
"uid": "Bucharest",
"metadata": {
"location": {
"city": "Bucharest",
"region": "Europe",
"latitude": 44.44,
"longitude": 26.1
}
},
"type": "Transceiver"
},
{
"uid": "Frankfurt",
"metadata": {
"location": {
"city": "Frankfurt",
"region": "Europe",
"latitude": 50.1199992,
"longitude": 8.68000104
}
},
"type": "Transceiver"
},
{
"uid": "Istanbul",
"metadata": {
"location": {
"city": "Istanbul",
"region": "Europe",
"latitude": 41.1,
"longitude": 29.0
}
},
"type": "Transceiver"
},
{
"uid": "London",
"metadata": {
"location": {
"city": "London",
"region": "Europe",
"latitude": 51.5200005,
"longitude": -0.100000296
}
},
"type": "Transceiver"
},
{
"uid": "Madrid",
"metadata": {
"location": {
"city": "Madrid",
"region": "Europe",
"latitude": 40.419998,
"longitude": -3.7100002
}
},
"type": "Transceiver"
},
{
"uid": "Paris",
"metadata": {
"location": {
"city": "Paris",
"region": "Europe",
"latitude": 48.86,
"longitude": 2.3399995
}
},
"type": "Transceiver"
},
{
"uid": "Rome",
"metadata": {
"location": {
"city": "Rome",
"region": "Europe",
"latitude": 41.8899996,
"longitude": 12.5000004
}
},
"type": "Transceiver"
},
{
"uid": "Vienna",
"metadata": {
"location": {
"city": "Vienna",
"region": "Europe",
"latitude": 48.2200024,
"longitude": 16.3700005
}
},
"type": "Transceiver"
},
{
"uid": "Warsaw",
"metadata": {
"location": {
"city": "Warsaw",
"region": "Europe",
"latitude": 52.2599987,
"longitude": 21.0200005
}
},
"type": "Transceiver"
},
{
"uid": "Zurich",
"metadata": {
"location": {
"city": "Zurich",
"region": "Europe",
"latitude": 47.3800015,
"longitude": 8.5399996
}
},
"type": "Transceiver"
},
{
"uid": "fiber (Amsterdam \u2192 Berlin)",
"metadata": {
"length": 690.608,
"units": "km",
"location": {
"latitude": 52.4450008,
"longitude": 9.134997075
}
},
"type": "Fiber"
},
{
"uid": "fiber (Amsterdam \u2192 Brussels)",
"metadata": {
"length": 210.729,
"units": "km",
"location": {
"latitude": 51.600000800000004,
"longitude": 4.610000575000001
}
},
"type": "Fiber"
},
{
"uid": "fiber (Amsterdam \u2192 Frankfurt)",
"metadata": {
"length": 436.324,
"units": "km",
"location": {
"latitude": 51.2449994,
"longitude": 6.785000095000001
}
},
"type": "Fiber"
},
{
"uid": "fiber (Berlin \u2192 Warsaw)",
"metadata": {
"length": 623.015,
"units": "km",
"location": {
"latitude": 52.390000349999994,
"longitude": 17.199997749999998
}
},
"type": "Fiber"
},
{
"uid": "fiber (Brussels \u2192 London)",
"metadata": {
"length": 381.913,
"units": "km",
"location": {
"latitude": 51.17500125,
"longitude": 2.115000852
}
},
"type": "Fiber"
},
{
"uid": "fiber (Bucharest \u2192 Istanbul)",
"metadata": {
"length": 528.58,
"units": "km",
"location": {
"latitude": 42.769999999999996,
"longitude": 27.55
}
},
"type": "Fiber"
},
{
"uid": "fiber (Bucharest \u2192 Warsaw)",
"metadata": {
"length": 1136.2,
"units": "km",
"location": {
"latitude": 48.34999935,
"longitude": 23.56000025
}
},
"type": "Fiber"
},
{
"uid": "fiber (Frankfurt \u2192 Vienna)",
"metadata": {
"length": 717.001,
"units": "km",
"location": {
"latitude": 49.1700008,
"longitude": 12.52500077
}
},
"type": "Fiber"
},
{
"uid": "fiber (Istanbul \u2192 Rome)",
"metadata": {
"length": 1650.406,
"units": "km",
"location": {
"latitude": 41.4949998,
"longitude": 20.7500002
}
},
"type": "Fiber"
},
{
"uid": "fiber (London \u2192 Paris)",
"metadata": {
"length": 411.692,
"units": "km",
"location": {
"latitude": 50.19000025,
"longitude": 1.1199996019999998
}
},
"type": "Fiber"
},
{
"uid": "fiber (Madrid \u2192 Paris)",
"metadata": {
"length": 1263.619,
"units": "km",
"location": {
"latitude": 44.639999,
"longitude": -0.6850003500000001
}
},
"type": "Fiber"
},
{
"uid": "fiber (Madrid \u2192 Zurich)",
"metadata": {
"length": 1497.358,
"units": "km",
"location": {
"latitude": 43.89999975,
"longitude": 2.4149997
}
},
"type": "Fiber"
},
{
"uid": "fiber (Rome \u2192 Vienna)",
"metadata": {
"length": 920.026,
"units": "km",
"location": {
"latitude": 45.055001000000004,
"longitude": 14.43500045
}
},
"type": "Fiber"
},
{
"uid": "fiber (Rome \u2192 Zurich)",
"metadata": {
"length": 823.4,
"units": "km",
"location": {
"latitude": 44.63500055,
"longitude": 10.52
}
},
"type": "Fiber"
},
{
"uid": "fiber (Vienna \u2192 Warsaw)",
"metadata": {
"length": 669.297,
"units": "km",
"location": {
"latitude": 50.24000055,
"longitude": 18.6950005
}
},
"type": "Fiber"
}
],
"connections": [
{
"from_node": "Amsterdam",
"to_node": "fiber (Amsterdam \u2192 Berlin)"
},
{
"from_node": "fiber (Amsterdam \u2192 Berlin)",
"to_node": "Amsterdam"
},
{
"from_node": "Amsterdam",
"to_node": "fiber (Amsterdam \u2192 Brussels)"
},
{
"from_node": "fiber (Amsterdam \u2192 Brussels)",
"to_node": "Amsterdam"
},
{
"from_node": "Amsterdam",
"to_node": "fiber (Amsterdam \u2192 Frankfurt)"
},
{
"from_node": "fiber (Amsterdam \u2192 Frankfurt)",
"to_node": "Amsterdam"
},
{
"from_node": "Berlin",
"to_node": "fiber (Berlin \u2192 Warsaw)"
},
{
"from_node": "fiber (Berlin \u2192 Warsaw)",
"to_node": "Berlin"
},
{
"from_node": "Brussels",
"to_node": "fiber (Brussels \u2192 London)"
},
{
"from_node": "fiber (Brussels \u2192 London)",
"to_node": "Brussels"
},
{
"from_node": "Bucharest",
"to_node": "fiber (Bucharest \u2192 Istanbul)"
},
{
"from_node": "fiber (Bucharest \u2192 Istanbul)",
"to_node": "Bucharest"
},
{
"from_node": "Bucharest",
"to_node": "fiber (Bucharest \u2192 Warsaw)"
},
{
"from_node": "fiber (Bucharest \u2192 Warsaw)",
"to_node": "Bucharest"
},
{
"from_node": "Frankfurt",
"to_node": "fiber (Frankfurt \u2192 Vienna)"
},
{
"from_node": "fiber (Frankfurt \u2192 Vienna)",
"to_node": "Frankfurt"
},
{
"from_node": "Istanbul",
"to_node": "fiber (Istanbul \u2192 Rome)"
},
{
"from_node": "fiber (Istanbul \u2192 Rome)",
"to_node": "Istanbul"
},
{
"from_node": "London",
"to_node": "fiber (London \u2192 Paris)"
},
{
"from_node": "fiber (London \u2192 Paris)",
"to_node": "London"
},
{
"from_node": "Madrid",
"to_node": "fiber (Madrid \u2192 Paris)"
},
{
"from_node": "fiber (Madrid \u2192 Paris)",
"to_node": "Madrid"
},
{
"from_node": "Madrid",
"to_node": "fiber (Madrid \u2192 Zurich)"
},
{
"from_node": "fiber (Madrid \u2192 Zurich)",
"to_node": "Madrid"
},
{
"from_node": "Rome",
"to_node": "fiber (Rome \u2192 Vienna)"
},
{
"from_node": "fiber (Rome \u2192 Vienna)",
"to_node": "Rome"
},
{
"from_node": "Rome",
"to_node": "fiber (Rome \u2192 Zurich)"
},
{
"from_node": "fiber (Rome \u2192 Zurich)",
"to_node": "Rome"
},
{
"from_node": "Vienna",
"to_node": "fiber (Vienna \u2192 Warsaw)"
},
{
"from_node": "fiber (Vienna \u2192 Warsaw)",
"to_node": "Vienna"
},
{
"from_node": "fiber (Amsterdam \u2192 Berlin)",
"to_node": "Berlin"
},
{
"from_node": "Berlin",
"to_node": "fiber (Amsterdam \u2192 Berlin)"
},
{
"from_node": "fiber (Amsterdam \u2192 Brussels)",
"to_node": "Brussels"
},
{
"from_node": "Brussels",
"to_node": "fiber (Amsterdam \u2192 Brussels)"
},
{
"from_node": "fiber (Amsterdam \u2192 Frankfurt)",
"to_node": "Frankfurt"
},
{
"from_node": "Frankfurt",
"to_node": "fiber (Amsterdam \u2192 Frankfurt)"
},
{
"from_node": "fiber (Berlin \u2192 Warsaw)",
"to_node": "Warsaw"
},
{
"from_node": "Warsaw",
"to_node": "fiber (Berlin \u2192 Warsaw)"
},
{
"from_node": "fiber (Brussels \u2192 London)",
"to_node": "London"
},
{
"from_node": "London",
"to_node": "fiber (Brussels \u2192 London)"
},
{
"from_node": "fiber (Bucharest \u2192 Istanbul)",
"to_node": "Istanbul"
},
{
"from_node": "Istanbul",
"to_node": "fiber (Bucharest \u2192 Istanbul)"
},
{
"from_node": "fiber (Bucharest \u2192 Warsaw)",
"to_node": "Warsaw"
},
{
"from_node": "Warsaw",
"to_node": "fiber (Bucharest \u2192 Warsaw)"
},
{
"from_node": "fiber (Frankfurt \u2192 Vienna)",
"to_node": "Vienna"
},
{
"from_node": "Vienna",
"to_node": "fiber (Frankfurt \u2192 Vienna)"
},
{
"from_node": "fiber (Istanbul \u2192 Rome)",
"to_node": "Rome"
},
{
"from_node": "Rome",
"to_node": "fiber (Istanbul \u2192 Rome)"
},
{
"from_node": "fiber (London \u2192 Paris)",
"to_node": "Paris"
},
{
"from_node": "Paris",
"to_node": "fiber (London \u2192 Paris)"
},
{
"from_node": "fiber (Madrid \u2192 Paris)",
"to_node": "Paris"
},
{
"from_node": "Paris",
"to_node": "fiber (Madrid \u2192 Paris)"
},
{
"from_node": "fiber (Madrid \u2192 Zurich)",
"to_node": "Zurich"
},
{
"from_node": "Zurich",
"to_node": "fiber (Madrid \u2192 Zurich)"
},
{
"from_node": "fiber (Rome \u2192 Vienna)",
"to_node": "Vienna"
},
{
"from_node": "Vienna",
"to_node": "fiber (Rome \u2192 Vienna)"
},
{
"from_node": "fiber (Rome \u2192 Zurich)",
"to_node": "Zurich"
},
{
"from_node": "Zurich",
"to_node": "fiber (Rome \u2192 Zurich)"
},
{
"from_node": "fiber (Vienna \u2192 Warsaw)",
"to_node": "Warsaw"
},
{
"from_node": "Warsaw",
"to_node": "fiber (Vienna \u2192 Warsaw)"
}
]
}

File diff suppressed because it is too large.

View File

@@ -1,8 +0,0 @@
-1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01
-2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02
-1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01
-2.0500000000000000e+01 -2.0489473680000000e+01 -2.0478947370000000e+01 -2.0468421050000000e+01 -2.0457894740000000e+01 -2.0447368420000000e+01 -2.0436842110000001e+01 -2.0426315790000000e+01 -2.0415789470000000e+01 -2.0405263160000001e+01 -2.0394736840000000e+01 -2.0384210530000001e+01 -2.0373684210000000e+01 -2.0363157890000000e+01 -2.0352631580000001e+01 -2.0342105260000000e+01 -2.0331578950000001e+01 -2.0321052630000001e+01 -2.0310526320000001e+01 -2.0300000000000001e+01 -2.0289473680000000e+01 -2.0278947370000001e+01 -2.0268421050000001e+01 -2.0257894740000001e+01 -2.0247368420000001e+01 -2.0236842110000001e+01 -2.0226315790000001e+01 -2.0215789470000001e+01 -2.0205263160000001e+01 -2.0194736840000001e+01 -2.0184210530000001e+01 -2.0173684210000001e+01 -2.0163157890000001e+01 -2.0152631580000001e+01 -2.0142105260000001e+01 -2.0131578950000002e+01 -2.0121052630000001e+01 -2.0110526320000002e+01 -2.0100000000000001e+01 -2.0089473680000001e+01 -2.0078947370000002e+01 -2.0068421050000001e+01 -2.0057894739999998e+01 -2.0047368420000002e+01 -2.0036842109999998e+01 -2.0026315790000002e+01 -2.0015789470000001e+01 -2.0005263159999998e+01 -1.9994736840000002e+01 -1.9984210529999999e+01 -1.9973684209999998e+01 -1.9963157890000002e+01 -1.9952631579999998e+01 -1.9942105260000002e+01 -1.9931578949999999e+01 -1.9921052629999998e+01 -1.9910526319999999e+01 -1.9899999999999999e+01 -1.9889473679999998e+01 -1.9878947369999999e+01 -1.9868421049999998e+01 -1.9857894739999999e+01 -1.9847368419999999e+01 -1.9836842109999999e+01 -1.9826315789999999e+01 -1.9815789469999999e+01 -1.9805263159999999e+01 -1.9794736839999999e+01 -1.9784210529999999e+01 -1.9773684209999999e+01 -1.9763157889999999e+01 -1.9752631579999999e+01 -1.9742105259999999e+01 -1.9731578949999999e+01 -1.9721052629999999e+01 -1.9710526320000000e+01 -1.9699999999999999e+01 -1.9689473679999999e+01 -1.9678947369999999e+01 -1.9668421049999999e+01 -1.9657894740000000e+01 -1.9647368419999999e+01 -1.9636842110000000e+01 -1.9626315790000000e+01 -1.9615789469999999e+01 -1.9605263160000000e+01 -1.9594736839999999e+01 -1.9584210530000000e+01 -1.9573684210000000e+01 -1.9563157889999999e+01 -1.9552631580000000e+01 -1.9542105260000000e+01 -1.9531578950000000e+01 -1.9521052630000000e+01 -1.9510526320000000e+01 -1.9500000000000000e+01
-2.0500000000000000e+01 -2.0489473680000000e+01 -2.0478947370000000e+01 -2.0468421050000000e+01 -2.0457894740000000e+01 -2.0447368420000000e+01 -2.0436842110000001e+01 -2.0426315790000000e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.9573684210000000e+01 -1.9563157889999999e+01 -1.9552631580000000e+01 -1.9542105260000000e+01 -1.9531578950000000e+01 -1.9521052630000000e+01 -1.9510526320000000e+01 -1.9500000000000000e+01
-1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.4460000000000001e+01
-1.4460000000000001e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01
-1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.4460000000000001e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02

View File

@@ -1,8 +0,0 @@
7.0000000000000000e+01 1.1700000000000000e+02 1.0800000000000000e+02 1.0800000000000000e+02 3.2000000000000000e+01 7.0000000000000000e+01 1.0800000000000000e+02 9.7000000000000000e+01 1.1600000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
7.9000000000000000e+01 1.1000000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 7.1000000000000000e+01 1.1400000000000000e+02 1.1100000000000000e+02 1.1700000000000000e+02 1.1200000000000000e+02 3.2000000000000000e+01 6.6000000000000000e+01 1.0800000000000000e+02 1.1700000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01
7.9000000000000000e+01 1.1000000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 7.1000000000000000e+01 1.1400000000000000e+02 1.1100000000000000e+02 1.1700000000000000e+02 1.1200000000000000e+02 3.2000000000000000e+01 8.2000000000000000e+01 1.0100000000000000e+02 1.0000000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01
7.0000000000000000e+01 1.1700000000000000e+02 1.0800000000000000e+02 1.0800000000000000e+02 3.2000000000000000e+01 1.1900000000000000e+02 3.2000000000000000e+01 8.3000000000000000e+01 8.2000000000000000e+01 8.3000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
6.6000000000000000e+01 1.1100000000000000e+02 1.1600000000000000e+02 1.0400000000000000e+02 3.2000000000000000e+01 6.9000000000000000e+01 1.1000000000000000e+02 1.0000000000000000e+02 1.1500000000000000e+02 3.2000000000000000e+01 1.1900000000000000e+02 3.2000000000000000e+01 8.3000000000000000e+01 8.2000000000000000e+01 8.3000000000000000e+01
1.0400000000000000e+02 1.0100000000000000e+02 9.7000000000000000e+01 1.1800000000000000e+02 1.2100000000000000e+02 3.2000000000000000e+01 9.8000000000000000e+01 1.0800000000000000e+02 1.1700000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
1.0400000000000000e+02 1.0100000000000000e+02 9.7000000000000000e+01 1.1800000000000000e+02 1.2100000000000000e+02 3.2000000000000000e+01 1.1400000000000000e+02 1.0100000000000000e+02 1.0000000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
1.1900000000000000e+02 1.1100000000000000e+02 1.1400000000000000e+02 1.1500000000000000e+02 1.1600000000000000e+02 3.2000000000000000e+01 9.9000000000000000e+01 9.7000000000000000e+01 1.1500000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01

View File

@@ -1,9 +0,0 @@
{
"params": {
"dfg": [25.13596985, 25.11822814, 25.09542133, 25.06245771, 25.02602765, 24.99637953, 24.98167255, 24.97530668, 24.98320726, 24.99718565, 25.01757247, 25.03832781, 25.05495585, 25.0670719, 25.07091411, 25.07094365, 25.07114324, 25.07533627, 25.08731018, 25.10313936, 25.12276204, 25.14239479, 25.15945633, 25.17392704, 25.17673767, 25.17037141, 25.15216254, 25.1311431, 25.10802335, 25.08548777, 25.06916675, 25.05848176, 25.05447313, 25.05154441, 25.04946059, 25.04717849, 25.04551656, 25.04467649, 25.0407292, 25.03285408, 25.0234883, 25.01659234, 25.01332136, 25.01123434, 25.01030015, 25.00936548, 25.00873964, 25.00842535, 25.00696466, 25.0040431, 25.00070998, 24.9984232, 24.99306332, 24.98352421, 24.97125103, 24.96038108, 24.94888721, 24.93531489, 24.92131927, 24.90898697, 24.89896514, 24.88958463, 24.8808387, 24.87210092, 24.86462026, 24.85839773, 24.85445838, 24.85155443, 24.85176601, 24.85408014, 24.85909624, 24.86474458, 24.87203486, 24.8803652, 24.88910669, 24.89721313, 24.90282604, 24.9065669, 24.9086508, 24.91093944, 24.91343079, 24.91592344, 24.92155351, 24.93031861, 24.94052812, 24.94904669, 24.95757123, 24.96781845, 24.98180093, 24.99782686, 25.01393183, 25.02809846, 25.04032575, 25.05256981, 25.06479701, 25.07704697],
"dgt": [2.714526681131686, 2.705443819238505, 2.6947834587664494, 2.6841217449620203, 2.6681935771243177, 2.6521732021128046, 2.630396440815385, 2.602860350286428, 2.5696460593920065, 2.5364027376452056, 2.499446286796604, 2.4587748041127506, 2.414398437185221, 2.3699990328716107, 2.322373696229342, 2.271520771371253, 2.2174389328192197, 2.16337565384239, 2.1183028432496016, 2.082225099873648, 2.055100772005235, 2.0279625371819305, 2.0008103857988204, 1.9736443063300082, 1.9482128147680253, 1.9245345552113182, 1.9026104247588487, 1.8806927939516411, 1.862235672444246, 1.847275503201129, 1.835814081380705, 1.824381436842932, 1.8139629377087627, 1.8045606557581335, 1.7961751115773796, 1.7877868031023945, 1.7793941781790852, 1.7709972329654864, 1.7625959636196327, 1.7541903672600494, 1.7459181197626403, 1.737780757913635, 1.7297783508684146, 1.7217732861435076, 1.7137640932265894, 1.7057507692361864, 1.6918150918099673, 1.6719047669939942, 1.6460167077689267, 1.6201194134191075, 1.5986915141218316, 1.5817353179379183, 1.569199764184379, 1.5566577309558969, 1.545374152761467, 1.5353620432989845, 1.5266220576235803, 1.5178910621476225, 1.5097346239790443, 1.502153039909686, 1.495145456062699, 1.488134243479226, 1.48111939735681, 1.474100442252211, 1.4670307626366115, 1.4599103316162523, 1.45273959485914, 1.445565137158368, 1.4340878115214444, 1.418273806730323, 1.3981208704326855, 1.3779439775587023, 1.3598972673004606, 1.3439818461440451, 1.3301807335621048, 1.316383926863083, 1.3040618749785347, 1.2932153453410835, 1.2838336236692311, 1.2744470198196236, 1.2650555289898042, 1.2556591482982988, 1.2428104897182262, 1.2264996957264114, 1.2067249615595257, 1.1869318618366975, 1.1672278304018044, 1.1476135933863398, 1.1280891949729075, 1.108555289615659, 1.0895983485572227, 1.0712204022764056, 1.0534217504465226, 1.0356155337864215, 1.017807767853702, 1.0],
"nf_fit_coeff": [0.000168241, 0.0469961, 0.0359549, 5.82851],
"nf_ripple": [-0.315374332, -0.315374332, -0.3154009157100272, -0.3184914611751095, -0.32158358425400546, -0.3246772861549999, -0.32762368641496226, -0.3205413846123276, -0.31345546385118733, -0.3063659213569748, -0.29920267890990127, -0.27061972852631744, -0.24202215770774693, -0.21340995523361256, -0.18478227130158695, -0.14809761118389625, -0.11139416731807622, -0.07467192527357988, -0.038026748965679924, -0.019958469399422092, -0.0018809287980157928, 0.01620587996057356, 0.03430196400570967, 0.05240733047405406, 0.07052198650959736, 0.079578036683472, 0.08854664736190952, 0.0975198632319653, 0.10649768784154924, 0.0977413804499074, 0.08880343717266004, 0.07986089973284587, 0.0709137645874038, 0.06333589274056531, 0.055756212252058776, 0.04817263174786321, 0.04058514821716236, 0.03338159167571013, 0.026178308595650738, 0.018971315351761126, 0.011760609076833628, 0.01695029492275999, 0.02227499135770144, 0.02760243318910433, 0.03293262254079026, 0.038265561538776145, 0.04360125231127117, 0.03485699074348155, 0.025991055149117932, 0.017120541224980364, 0.008275758735920322, 0.0019423214065246042, -0.004394389017104359, -0.010734375072893196, -0.017077639301414434, -0.02467970289957285, -0.03229797040382168, -0.03992018009047725, -0.04753456632753024, -0.049234003141433724, -0.05093432003654719, -0.05263551769669225, -0.05433759680640246, -0.0560405580509193, -0.057718452237076875, -0.056840590379175944, -0.055962273198734966, -0.05508350034141658, -0.054204271452516814, -0.05839608872695511, -0.06262733016971533, -0.0668607690892037, -0.07090173625606945, -0.05209609730905224, -0.03328068412141294, -0.014455489070928059, 0.004315038757905716, 0.014839202394482527, 0.025368841662503576, 0.03590396083646565, 0.0464445641953214, 0.05699065602246746, 0.06754224060577406, 0.10002709623672751, 0.13258013095133617, 0.1651501336277331, 0.1977371175359939, 0.23194802687829724, 0.26618779883837107, 0.3004454365808535, 0.33472095409250663, 0.35929034770587287, 0.38384389188855605, 0.40841026111391787, 0.43298946543290784, 0.43298946543290784],
"frequencies": []
}
}
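This block is an EDFA parameter set of the kind documented by the scripts further down this page: a design flat gain profile (dfg), the dynamic gain tilt (dgt), cubic noise-figure fit coefficients (nf_fit_coeff, highest power first), and the NF ripple. A minimal sketch of how such a block could be consumed, assuming a file name and a 20 dB gain target purely for illustration:

import json
import numpy as np

with open('edfa_config.json') as f:                # hypothetical file name
    params = json.load(f)['params']
dfg = np.array(params['dfg'])                      # design flat gain per channel, dB
dg = 20.0 - dfg.mean()                             # gain change from design flat, dB
nf_avg = np.polyval(params['nf_fit_coeff'], dg)    # average NF at this gain target, dB
nf_per_channel = np.array(params['nf_ripple']) + nf_avg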

View File

@@ -1,72 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import matplotlib.pyplot as plt
import numpy as np
from gnpy.core.utils import (load_json,
                             itufs,
                             freq2wavelength,
                             lin2db,
                             db2lin)
from gnpy.core import network
topology = load_json('edfa_example_network.json')
nw = network.network_from_json(topology)
pch2d_legend_data = np.loadtxt('Pchan2DLegend.txt')
pch2d = np.loadtxt('Pchan2D.txt')
ch_spacing = 0.05
fc = itufs(ch_spacing)
lc = freq2wavelength(fc) / 1000
nchan = np.arange(len(lc))
df = np.ones(len(lc)) * ch_spacing
edfa1 = [n for n in nw.nodes() if n.uid == 'Edfa1'][0]
edfa1.gain_target = 20.0
edfa1.tilt_target = -0.7
edfa1.calc_nf()
results = []
for Pin in pch2d:
    chgain = edfa1.gain_profile(Pin)
    pase = edfa1.noise_profile(chgain, fc, df)
    pout = lin2db(db2lin(Pin + chgain) + db2lin(pase))
    results.append(pout)
# Generate legend text
pch2d_legend = []
for ea in pch2d_legend_data:
    s = ''.join([chr(xx) for xx in ea.astype(dtype=int)]).strip()
    pch2d_legend.append(s)
# Plot
axis_font = {'fontname': 'Arial', 'size': '16', 'fontweight': 'bold'}
title_font = {'fontname': 'Arial', 'size': '17', 'fontweight': 'bold'}
tic_font = {'fontname': 'Arial', 'size': '12'}
plt.rcParams["font.family"] = "Arial"
plt.figure()
plt.plot(nchan, pch2d.T, '.-', lw=2)
plt.xlabel('Channel Number', **axis_font)
plt.ylabel('Channel Power [dBm]', **axis_font)
plt.title('Input Power Profiles for Different Channel Loading', **title_font)
plt.legend(pch2d_legend, loc=5)
plt.grid()
plt.ylim((-100, -10))
plt.xlim((0, 110))
plt.xticks(np.arange(0, 100, 10), **tic_font)
plt.yticks(np.arange(-110, -10, 10), **tic_font)
plt.figure()
for result in results:
    plt.plot(nchan, result, '.-', lw=2)
plt.title('Output Power w/ ASE for Different Channel Loading', **title_font)
plt.xlabel('Channel Number', **axis_font)
plt.ylabel('Channel Power [dBm]', **axis_font)
plt.grid()
plt.ylim((-50, 10))
plt.xlim((0, 100))
plt.xticks(np.arange(0, 100, 10), **tic_font)
plt.yticks(np.arange(-50, 10, 10), **tic_font)
plt.legend(pch2d_legend, loc=5)
plt.show()
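The key step in the loop above is adding signal and ASE in linear units before converting back to dB (pout = lin2db(db2lin(Pin + chgain) + db2lin(pase))). A quick standalone check of that arithmetic with made-up numbers, independent of gnpy's helpers:

from math import log10

pin, gain, pase = -20.0, 20.0, -30.0               # dBm, dB, dBm (illustrative only)
psig = pin + gain                                  # 0 dBm signal at the amplifier output
pout = 10 * log10(10 ** (psig / 10) + 10 ** (pase / 10))
# pout is roughly 0.004 dBm: ASE 30 dB below the signal barely changes the total power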

View File

@@ -1,15 +0,0 @@
{
"gain_flatmax": 25,
"gain_min": 15,
"p_max": 21,
"nf_fit_coeff": "pNFfit3.txt",
"nf_ripple": "NFR_96.txt",
"dfg": "DFG_96.txt",
"dgt": "DGT_96.txt",
"nf_model":
{
"enabled": true,
"nf_min": 5.8,
"nf_max": 10
}
}

View File

@@ -1,8 +0,0 @@
-1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01
-2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02
-1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01 -2.0000000000000000e+01
-2.0500000000000000e+01 -2.0489473680000000e+01 -2.0478947370000000e+01 -2.0468421050000000e+01 -2.0457894740000000e+01 -2.0447368420000000e+01 -2.0436842110000001e+01 -2.0426315790000000e+01 -2.0415789470000000e+01 -2.0405263160000001e+01 -2.0394736840000000e+01 -2.0384210530000001e+01 -2.0373684210000000e+01 -2.0363157890000000e+01 -2.0352631580000001e+01 -2.0342105260000000e+01 -2.0331578950000001e+01 -2.0321052630000001e+01 -2.0310526320000001e+01 -2.0300000000000001e+01 -2.0289473680000000e+01 -2.0278947370000001e+01 -2.0268421050000001e+01 -2.0257894740000001e+01 -2.0247368420000001e+01 -2.0236842110000001e+01 -2.0226315790000001e+01 -2.0215789470000001e+01 -2.0205263160000001e+01 -2.0194736840000001e+01 -2.0184210530000001e+01 -2.0173684210000001e+01 -2.0163157890000001e+01 -2.0152631580000001e+01 -2.0142105260000001e+01 -2.0131578950000002e+01 -2.0121052630000001e+01 -2.0110526320000002e+01 -2.0100000000000001e+01 -2.0089473680000001e+01 -2.0078947370000002e+01 -2.0068421050000001e+01 -2.0057894739999998e+01 -2.0047368420000002e+01 -2.0036842109999998e+01 -2.0026315790000002e+01 -2.0015789470000001e+01 -2.0005263159999998e+01 -1.9994736840000002e+01 -1.9984210529999999e+01 -1.9973684209999998e+01 -1.9963157890000002e+01 -1.9952631579999998e+01 -1.9942105260000002e+01 -1.9931578949999999e+01 -1.9921052629999998e+01 -1.9910526319999999e+01 -1.9899999999999999e+01 -1.9889473679999998e+01 -1.9878947369999999e+01 -1.9868421049999998e+01 -1.9857894739999999e+01 -1.9847368419999999e+01 -1.9836842109999999e+01 -1.9826315789999999e+01 -1.9815789469999999e+01 -1.9805263159999999e+01 -1.9794736839999999e+01 -1.9784210529999999e+01 -1.9773684209999999e+01 -1.9763157889999999e+01 -1.9752631579999999e+01 -1.9742105259999999e+01 -1.9731578949999999e+01 -1.9721052629999999e+01 -1.9710526320000000e+01 -1.9699999999999999e+01 -1.9689473679999999e+01 -1.9678947369999999e+01 -1.9668421049999999e+01 -1.9657894740000000e+01 -1.9647368419999999e+01 -1.9636842110000000e+01 -1.9626315790000000e+01 -1.9615789469999999e+01 -1.9605263160000000e+01 -1.9594736839999999e+01 -1.9584210530000000e+01 -1.9573684210000000e+01 -1.9563157889999999e+01 -1.9552631580000000e+01 -1.9542105260000000e+01 -1.9531578950000000e+01 -1.9521052630000000e+01 -1.9510526320000000e+01 -1.9500000000000000e+01
-2.0500000000000000e+01 -2.0489473680000000e+01 -2.0478947370000000e+01 -2.0468421050000000e+01 -2.0457894740000000e+01 -2.0447368420000000e+01 -2.0436842110000001e+01 -2.0426315790000000e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.9573684210000000e+01 -1.9563157889999999e+01 -1.9552631580000000e+01 -1.9542105260000000e+01 -1.9531578950000000e+01 -1.9521052630000000e+01 -1.9510526320000000e+01 -1.9500000000000000e+01
-1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.4460000000000001e+01
-1.4460000000000001e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01 -1.4460000000000001e+01
-1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.4460000000000001e+01 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02 -1.0000000000000000e+02

View File

@@ -1,8 +0,0 @@
7.0000000000000000e+01 1.1700000000000000e+02 1.0800000000000000e+02 1.0800000000000000e+02 3.2000000000000000e+01 7.0000000000000000e+01 1.0800000000000000e+02 9.7000000000000000e+01 1.1600000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
7.9000000000000000e+01 1.1000000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 7.1000000000000000e+01 1.1400000000000000e+02 1.1100000000000000e+02 1.1700000000000000e+02 1.1200000000000000e+02 3.2000000000000000e+01 6.6000000000000000e+01 1.0800000000000000e+02 1.1700000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01
7.9000000000000000e+01 1.1000000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 7.1000000000000000e+01 1.1400000000000000e+02 1.1100000000000000e+02 1.1700000000000000e+02 1.1200000000000000e+02 3.2000000000000000e+01 8.2000000000000000e+01 1.0100000000000000e+02 1.0000000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01
7.0000000000000000e+01 1.1700000000000000e+02 1.0800000000000000e+02 1.0800000000000000e+02 3.2000000000000000e+01 1.1900000000000000e+02 3.2000000000000000e+01 8.3000000000000000e+01 8.2000000000000000e+01 8.3000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
6.6000000000000000e+01 1.1100000000000000e+02 1.1600000000000000e+02 1.0400000000000000e+02 3.2000000000000000e+01 6.9000000000000000e+01 1.1000000000000000e+02 1.0000000000000000e+02 1.1500000000000000e+02 3.2000000000000000e+01 1.1900000000000000e+02 3.2000000000000000e+01 8.3000000000000000e+01 8.2000000000000000e+01 8.3000000000000000e+01
1.0400000000000000e+02 1.0100000000000000e+02 9.7000000000000000e+01 1.1800000000000000e+02 1.2100000000000000e+02 3.2000000000000000e+01 9.8000000000000000e+01 1.0800000000000000e+02 1.1700000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
1.0400000000000000e+02 1.0100000000000000e+02 9.7000000000000000e+01 1.1800000000000000e+02 1.2100000000000000e+02 3.2000000000000000e+01 1.1400000000000000e+02 1.0100000000000000e+02 1.0000000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01
1.1900000000000000e+02 1.1100000000000000e+02 1.1400000000000000e+02 1.1500000000000000e+02 1.1600000000000000e+02 3.2000000000000000e+01 9.9000000000000000e+01 9.7000000000000000e+01 1.1500000000000000e+02 1.0100000000000000e+02 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01 3.2000000000000000e+01

View File

@@ -1,301 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Mon Nov 27 12:32:04 2017
@author: briantaylor
"""
import numpy as np
from numpy import polyfit, polyval, mean
from utilities import lin2db, db2lin, itufs, freq2wavelength
import matplotlib.pyplot as plt
from scipy.constants import h
def noise_profile(nf, gain, ffs, df):
""" noise_profile(nf, gain, ffs, df) computes amplifier ase
:param nf: Noise figure in dB
:param gain: Actual gain calculated for the EDFA in dB units
:param ffs: A numpy array of frequencies
:param df: the reference bw in THz
:type nf: numpy.ndarray
:type gain: numpy.ndarray
:type ffs: numpy.ndarray
:type df: float
:return: the ASE power in dBm
:rtype: numpy.ndarray
ASE POWER USING PER CHANNEL GAIN PROFILE
INPUTS:
NF_dB - Noise figure in dB, vector of length number of channels or
spectral slices
G_dB - Actual gain calculated for the EDFA, vector of length number of
channels or spectral slices
ffs - Center frequency grid of the channels or spectral slices in THz,
vector of length number of channels or spectral slices
dF - width of each channel or spectral slice in THz,
vector of length number of channels or spectral slices
OUTPUT:
ase_dBm - ase in dBm per channel or spectral slice
NOTE: the output is the total ASE in the channel or spectral slice. For
50GHz channels the ASE BW is effectively 0.4nm. To get to noise power in
0.1nm, subtract 6dB.
OSNR is usually quoted as channel power divided by
the ASE power in 0.1nm RBW, regardless of the width of the actual
channel. This is a historical convention from the days when optical
signals were much smaller (155Mbps, 2.5Gbps, ... 10Gbps) than the
resolution of the OSAs that were used to measure spectral power which
were set to 0.1nm resolution for convenience. Moving forward into
flexible grid and high baud rate signals, it may be convenient to begin
quoting power spectral density in the same BW for both signal and ASE,
e.g. 12.5GHz."""
h_mWThz = 1e-3 * h * (1e14)**2
nf_lin = db2lin(nf)
g_lin = db2lin(gain)
ase = h_mWThz * df * ffs * (nf_lin * g_lin - 1)
asedb = lin2db(ase)
return asedb
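# A hypothetical usage sketch of noise_profile() above; the grid, NF and gain
# values below are illustrative assumptions, not data from this repository.
_example_ffs = np.array([193.10, 193.15, 193.20])        # channel centres, THz
_example_df = np.full(_example_ffs.shape, 0.05)          # 50 GHz slices, THz
_example_nf = np.full(_example_ffs.shape, 5.5)           # flat noise figure, dB
_example_gain = np.full(_example_ffs.shape, 20.0)        # flat gain, dB
_example_ase = noise_profile(_example_nf, _example_gain, _example_ffs, _example_df)
# Per the note above, _example_ase - 6 approximates the same ASE in a 0.1 nm RBW.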
def gain_profile(dfg, dgt, Pin, gp, gtp):
"""
:param dfg: design flat gain
:param dgt: design gain tilt
:param Pin: channel input power profile
:param gp: Average gain setpoint in dB units
:param gtp: gain tilt setting
:type dfg: numpy.ndarray
:type dgt: numpy.ndarray
:type Pin: numpy.ndarray
:type gp: float
:type gtp: float
:return: gain profile in dB
:rtype: numpy.ndarray
AMPLIFICATION USING INPUT PROFILE
INPUTS:
DFG - vector of length number of channels or spectral slices
DGT - vector of length number of channels or spectral slices
Pin - input powers vector of length number of channels or
spectral slices
Gp - provisioned gain length 1
GTp - provisioned tilt length 1
OUTPUT:
amp gain per channel or spectral slice
NOTE: there is no checking done for violations of the total output power
capability of the amp.
Ported from Matlab version written by David Boerges at Ciena.
Based on:
R. di Muro, "The Er3+ fiber gain coefficient derived from a dynamic
gain
tilt technique", Journal of Lightwave Technology, Vol. 18, Iss. 3,
Pp. 343-347, 2000.
"""
err_tolerance = 1.0e-11
simple_opt = True
# TODO make all values linear unit and convert to dB units as needed within
# this function.
nchan = list(range(len(Pin)))
# TODO find a way to use these or lose them. Primarily we should have a
# way to determine if exceeding the gain or output power of the amp
tot_in_power_db = lin2db(np.sum(db2lin(Pin)))
avg_gain_db = lin2db(mean(db2lin(dfg)))
# Linear fit to get the DGT slope
p = polyfit(nchan, dgt, 1)
dgt_slope = p[0]
# Calculate the target slope. Currently assumes equally spaced channels.
# TODO make it so that it supports arbitrary channel spacing.
targ_slope = gtp / (len(nchan) - 1)
# 1st estimate of DGT scaling
dgts1 = targ_slope / dgt_slope
# when simple_opt is true, the code makes 2 attempts to compute gain and
# the internal voa value. This is currently here to provide direct
# comparison with original Matlab code. Will be removed.
# TODO replace with loop
if simple_opt:
# 1st estimate of Er gain & voa loss
g1st = dfg + dgt * dgts1
voa = lin2db(mean(db2lin(g1st))) - gp
# 2nd estimate of Amp ch gain using the channel input profile
g2nd = g1st - voa
pout_db = lin2db(np.sum(db2lin(Pin + g2nd)))
dgts2 = gp - (pout_db - tot_in_power_db)
# Center estimate of amp ch gain
xcent = dgts2
gcent = g1st - voa + dgt * xcent
pout_db = lin2db(np.sum(db2lin(Pin + gcent)))
gavg_cent = pout_db - tot_in_power_db
# Lower estimate of Amp ch gain
deltax = np.max(g1st) - np.min(g1st)
xlow = dgts2 - deltax
glow = g1st - voa + xlow * dgt
pout_db = lin2db(np.sum(db2lin(Pin + glow)))
gavg_low = pout_db - tot_in_power_db
# Upper gain estimate
xhigh = dgts2 + deltax
ghigh = g1st - voa + xhigh * dgt
pout_db = lin2db(np.sum(db2lin(Pin + ghigh)))
gavg_high = pout_db - tot_in_power_db
# compute slope
slope1 = (gavg_low - gavg_cent) / (xlow - xcent)
slope2 = (gavg_cent - gavg_high) / (xcent - xhigh)
if np.abs(gp - gavg_cent) <= err_tolerance:
dgts3 = xcent
elif gp < gavg_cent:
dgts3 = xcent - (gavg_cent - gp) / slope1
else:
dgts3 = xcent + (-gavg_cent + gp) / slope2
gprofile = g1st - voa + dgt * dgts3
else:
gprofile = None
return gprofile
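# A hypothetical usage sketch of gain_profile() above, assuming DFG_96.txt and
# DGT_96.txt from this changeset are present in the working directory and a
# flat -20 dBm/channel load; the setpoints mirror the __main__ example below.
_dfg_example = np.loadtxt('DFG_96.txt')
_dgt_example = np.loadtxt('DGT_96.txt')
_pin_example = np.full(_dfg_example.shape, -20.0)        # dBm per channel
_gain_example = gain_profile(_dfg_example, _dgt_example, _pin_example, 20.0, -0.7)
# _gain_example is the per-channel gain in dB for a 20 dB average gain target
# and the same -0.7 tilt setting used in the __main__ section.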
if __name__ == '__main__':
plt.close('all')
fc = itufs(0.05)
lc = freq2wavelength(fc) / 1000
nchan = list(range(len(lc)))
df = np.array([0.05] * (nchan[-1] + 1))
# TODO remove path dependence
path = ''
"""
DFG_96: Design flat gain at each wavelength in the 96 channel 50GHz ITU
grid in dB. This can be experimentally determined by measuring the gain
at each wavelength using a full, flat channel (or ASE) load at the input.
The amplifier should be set to its maximum flat gain (tilt = 0dB). This
measurement captures the ripple of the amplifier. If the amplifier was
designed to be minimum ripple at some other tilt value, then the ripple
reflected in this measurement will not be that minimum. However, when
the DGT gets applied through the provisioning of tilt, the model should
accurately reproduce the expected ripple at that tilt value. One could
also do the measurement at some expected tilt value and back-calculate
this vector using the DGT method. Alternatively, one could re-write the
algorithm to accept a nominal tilt and a tilted version of this vector.
"""
dfg_96 = np.loadtxt(path + 'DFG_96.txt')
"""maximum gain for flat operation - the amp in the data file was designed
for 25dB gain and has an internal VOA for setting the external gain
"""
avg_dfg = dfg_96.mean()
"""
DGT_96: This is the so-called Dynamic Gain Tilt of the EDFA in dB/dB. It
is the change in gain at each wavelength corresponding to a 1dB change at
the longest wavelength supported. The value can be obtained
experimentally or through analysis of the cross sections or Giles
parameters of the Er fibre. This is experimentally measured by changing
the gain of the amplifier above the maximum flat gain while not changing
the internal VOA (i.e. the mid-stage VOA is set to minimum and does not
change during the measurement). Note that the measurement can change the
gain by an arbitrary amount; the per-wavelength gain change is then divided by
the gain change (in dB) measured at the reference wavelength (the red end of
the band).
"""
dgt_96 = np.loadtxt(path + 'DGT_96.txt')
"""
pNFfit3: Cubic polynomial fit coefficients to noise figure in dB
averaged across wavelength as a function of gain change from design flat:
NFavg = pNFfit3(1)*dG^3 + pNFfit3(2)*dG^2 + pNFfit3(3)*dG + pNFfit3(4)
where
dG = GainTarget - average(DFG_96)
note that dG will normally be a negative value.
"""
nf_fitco = np.loadtxt(path + 'pNFfit3.txt')
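# Worked example of the cubic fit described above (using the nf_fit_coeff values
# shown in the JSON earlier on this page, which appear to hold the same data):
# for a gain target 5 dB below the design flat gain, dG = -5 and
#   NFavg = 0.000168241*(-5)**3 + 0.0469961*(-5)**2 + 0.0359549*(-5) + 5.82851
# which evaluates to approximately 6.80 dB, exactly what polyval(nf_fitco, dg)
# computes a few lines below.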
"""NFR_96: Noise figure ripple in dB away from the average noise figure
across the band. This captures the wavelength dependence of the NF. To
calculate the NF across channels, one uses the cubic fit coefficients
with the external gain target to get the average noise figure, NFavg, and
then adds this to NFR_96:
NF_96 = NFR_96 + NFavg
"""
nf_ripple = np.loadtxt(path + 'NFR_96.txt')
# This is an example to set the provisionable gain and gain-tilt values
# Tilt is in units of dB/THz
gain_target = 20.0
tilt_target = -0.7
# calculate the NF for the EDFA at this gain setting
dg = gain_target - avg_dfg
nf_avg = polyval(nf_fitco, dg)
nf_96 = nf_ripple + nf_avg
# get the input power profiles to show
pch2d = np.loadtxt(path + 'Pchan2D.txt')
# Load legend and assemble legend text
pch2d_legend_data = np.loadtxt(path + 'Pchan2DLegend.txt')
pch2d_legend = []
for ea in pch2d_legend_data:
s = ''.join([chr(xx) for xx in ea.astype(dtype=int)]).strip()
pch2d_legend.append(s)
# assemble plot
axis_font = {'fontname': 'Arial', 'size': '16', 'fontweight': 'bold'}
title_font = {'fontname': 'Arial', 'size': '17', 'fontweight': 'bold'}
tic_font = {'fontname': 'Arial', 'size': '12'}
plt.rcParams["font.family"] = "Arial"
plt.figure()
plt.plot(nchan, pch2d.T, '.-', lw=2)
plt.xlabel('Channel Number', **axis_font)
plt.ylabel('Channel Power [dBm]', **axis_font)
plt.title('Input Power Profiles for Different Channel Loading',
**title_font)
plt.legend(pch2d_legend, loc=5)
plt.grid()
plt.ylim((-100, -10))
plt.xlim((0, 110))
plt.xticks(np.arange(0, 100, 10), **tic_font)
plt.yticks(np.arange(-110, -10, 10), **tic_font)
plt.figure()
ea = pch2d[1, :]
for ea in pch2d:
chgain = gain_profile(dfg_96, dgt_96, ea, gain_target, tilt_target)
pase = noise_profile(nf_96, chgain, fc, df)
pout = lin2db(db2lin(ea + chgain) + db2lin(pase))
plt.plot(nchan, pout, '.-', lw=2)
plt.title('Output Power with ASE for Different Channel Loading',
**title_font)
plt.xlabel('Channel Number', **axis_font)
plt.ylabel('Channel Power [dBm]', **axis_font)
plt.grid()
plt.ylim((-50, 10))
plt.xlim((0, 100))
plt.xticks(np.arange(0, 100, 10), **tic_font)
plt.yticks(np.arange(-50, 10, 10), **tic_font)
plt.legend(pch2d_legend, loc=5)
plt.show()

View File

@@ -1,164 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Jan 30 12:32:00 2018
@author: jeanluc-auge
@comments about amplifier input files from Brian Taylor & Dave Boertjes
update an existing json file with all the 96ch txt files for a given amplifier type
amplifier type 'OA_type1' is hard coded but can be modified and other types added
returns an updated amplifier json file: output_json_file_name = 'edfa_config.json'
"""
import re
import sys
import json
import numpy as np
from gnpy.core.utils import lin2db, db2lin
"""amplifier file names
convert a set of amplifier files + an input json definition file into a valid edfa_json_file:
nf_fit_coeff: NF polynomial coefficients txt file (optional)
nf_ripple: NF ripple excursion txt file
dfg: design flat gain txt file
dgt: dynamic gain tilt txt file
input json file given as argument (default = 'OA.json')
the json input file should have the following fields:
{
"gain_flatmax": 25,
"gain_min": 15,
"p_max": 21,
"nf_fit_coeff": "pNFfit3.txt",
"nf_ripple": "NFR_96.txt",
"dfg": "DFG_96.txt",
"dgt": "DGT_96.txt",
"nf_model":
{
"enabled": true,
"nf_min": 5.8,
"nf_max": 10
}
}
gain_flat = max flat gain (dB)
gain_min = min gain (dB) : will consider an input VOA if below (TBD vs throwing an exception)
p_max = max power (dBm)
nf_fit = boolean (True, False) :
if False nf_fit_coeff are ignored and nf_model fields are used
"""
input_json_file_name = "OA.json" #default path
output_json_file_name = "edfa_config.json"
param_field ="params"
gain_min_field = "gain_min"
gain_max_field = "gain_flatmax"
gain_ripple_field = "dfg"
nf_ripple_field = "nf_ripple"
nf_fit_coeff = "nf_fit_coeff"
nf_model_field = "nf_model"
nf_model_enabled_field = "enabled"
nf_min_field ="nf_min"
nf_max_field = "nf_max"
def read_file(field, file_name):
"""read and format the 96 channels txt files describing the amplifier NF and ripple
convert dfg into gain ripple by removing the mean component
"""
#with open(path + file_name,'r') as this_file:
# data = this_file.read()
#data.strip()
#data = re.sub(r"([0-9])([ ]{1,3})([0-9-+])",r"\1,\3",data)
#data = list(data.split(","))
#data = [float(x) for x in data]
data = np.loadtxt(file_name)
if field == gain_ripple_field or field == nf_ripple_field:
#consider ripple excursion only to avoid redundant information
#because the max flat_gain is already given by the 'gain_flat' field in json
#remove the mean component
data = data - data.mean()
data = data.tolist()
return data
def nf_model(amp_dict):
if amp_dict[nf_model_field][nf_model_enabled_field] == True:
gain_min = amp_dict[gain_min_field]
gain_max = amp_dict[gain_max_field]
nf_min = amp_dict[nf_model_field][nf_min_field]
nf_max = amp_dict[nf_model_field][nf_max_field]
#use NF estimation model based on NFmin and NFmax in json OA file
delta_p = 5 #max power dB difference between 1st and 2nd stage coils
#g1a = (1st stage gain) - (internal voa attenuation), in dB
g1a_min = gain_min - (gain_max-gain_min) - delta_p
g1a_max = gain_max - delta_p
#nf1 and nf2 are the nf of the 1st and 2nd stage coils
#calculate nf1 and nf2 values that solve nf_[min/max] = nf1 + nf2 / g1a[min/max]
nf2 = lin2db((db2lin(nf_min) - db2lin(nf_max)) / (1/db2lin(g1a_max)-1/db2lin(g1a_min)))
nf1 = lin2db(db2lin(nf_min)- db2lin(nf2)/db2lin(g1a_max)) #expression (1)
""" now checking and recalculating the results:
recalculate delta_p to check it is within [1-6] boundaries
This is to check that the nf_min and nf_max values from the json file
make sense. If not, a warning is printed. """
if nf1 < 4:
print('1st coil nf calculated value {} is too low: revise inputs'.format(nf1))
if nf2 < nf1 + 0.3 or nf2 > nf1 + 2:
"""nf2 should be with [nf1+0.5 - nf1 +2] boundaries
there shouldn't be very high nf differences between 2 coils
=> recalculate delta_p
"""
nf2 = max(nf2, nf1+0.3)
nf2 = min(nf2, nf1+2)
g1a_max = lin2db(db2lin(nf2) / (db2lin(nf_min) - db2lin(nf1))) #use expression (1)
delta_p = gain_max - g1a_max
g1a_min = gain_min - (gain_max-gain_min) - delta_p
if delta_p < 1 or delta_p > 6:
#delta_p should be > 1dB and < 6dB => consider user warning if not
print('1st coil vs 2nd coil calculated DeltaP {} is not valid: revise inputs'
.format(delta_p))
#check the calculated values for nf1 & nf2:
nf_min_calc = lin2db(db2lin(nf1) + db2lin(nf2)/db2lin(g1a_max))
nf_max_calc = lin2db(db2lin(nf1) + db2lin(nf2)/db2lin(g1a_min))
if (abs(nf_min_calc-nf_min) > 0.01) or (abs(nf_max_calc-nf_max) > 0.01):
print('nf model calculation failed with nf_min {} and nf_max {} calculated'
.format(nf_min_calc, nf_max_calc))
print('do not use the generated edfa_config.json file')
else :
(nf1, nf2, delta_p) = (0, 0, 0)
return (nf1, nf2, delta_p)
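# A sketch of the model solved above, in linear units: with g1a denoting the
# first-coil gain minus the internal VOA attenuation,
#   nf_total_lin = nf1_lin + nf2_lin / g1a_lin
# is evaluated at g1a_max (giving nf_min) and at g1a_min (giving nf_max), and the
# two equations are solved for nf1 and nf2. A quick consistency check against the
# generated edfa_config.json further down this page (values rounded):
#   lin2db(db2lin(5.7279) + db2lin(7.7279) / db2lin(25 - 5.2384))    # ~5.8 dB = nf_min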
def input_json(path):
"""read the json input file and add all the 96 channels txt files
create the output json file with output_json_file_name"""
with open(path,'r') as edfa_json_file:
amp_text = edfa_json_file.read()
amp_dict = json.loads(amp_text)
for k, v in amp_dict.items():
if re.search(r'.txt$',str(v)) :
amp_dict[k] = read_file(k, v)
#calculate nf of 1st and 2nd coil for the nf_model if 'enabled'==true
(nf1, nf2, delta_p) = nf_model(amp_dict)
#rename nf_min and nf_max in nf1 and nf2 after the nf model calculation:
del amp_dict[nf_model_field][nf_min_field]
del amp_dict[nf_model_field][nf_max_field]
amp_dict[nf_model_field]['nf1'] = nf1
amp_dict[nf_model_field]['nf2'] = nf2
amp_dict[nf_model_field]['delta_p'] = delta_p
#rename dfg into gain_ripple after removing the average part:
amp_dict['gain_ripple'] = amp_dict.pop(gain_ripple_field)
new_amp_dict = {}
new_amp_dict[param_field] = amp_dict
amp_text = json.dumps(new_amp_dict, indent=4)
#print(amp_text)
with open(output_json_file_name,'w') as edfa_json_file:
edfa_json_file.write(amp_text)
if __name__ == '__main__':
    if len(sys.argv) == 2:
        path = sys.argv[1]
    else:
        path = input_json_file_name
    input_json(path)
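A hypothetical invocation of this converter (its filename is not shown in this diff, so a placeholder is used): it takes the input definition file as its only argument, reads the *.txt files referenced there, and writes edfa_config.json into the current directory.

# python3 convert_oa_txt_to_json.py OA.json       # placeholder script name
# -> edfa_config.json with gain_ripple, dgt, nf_ripple, nf_fit_coeff and the
#    derived nf_model (nf1, nf2, delta_p) fields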

View File

@@ -1,313 +0,0 @@
{
"params": {
"gain_flatmax": 25,
"gain_min": 15,
"p_max": 21,
"nf_fit_coeff": [
0.000168241,
0.0469961,
0.0359549,
5.82851
],
"nf_ripple": [
-0.3110761646066259,
-0.3110761646066259,
-0.31110274831665313,
-0.31419329378173544,
-0.3172854168606314,
-0.32037911876162584,
-0.3233255190215882,
-0.31624321721895354,
-0.30915729645781326,
-0.30206775396360075,
-0.2949045115165272,
-0.26632156113294336,
-0.23772399031437283,
-0.20911178784023846,
-0.18048410390821285,
-0.14379944379052215,
-0.10709599992470213,
-0.07037375788020579,
-0.03372858157230583,
-0.015660302006048,
0.0024172385953583004,
0.020504047353947653,
0.03860013139908377,
0.05670549786742816,
0.07482015390297145,
0.0838762040768461,
0.09284481475528361,
0.1018180306253394,
0.11079585523492333,
0.1020395478432815,
0.09310160456603413,
0.08415906712621996,
0.07521193198077789,
0.0676340601339394,
0.06005437964543287,
0.052470799141237305,
0.044883315610536455,
0.037679759069084225,
0.03047647598902483,
0.02326948274513522,
0.01605877647020772,
0.021248462316134083,
0.02657315875107553,
0.03190060058247842,
0.03723078993416436,
0.04256372893215024,
0.047899419704645264,
0.03915515813685565,
0.030289222542492025,
0.021418708618354456,
0.012573926129294415,
0.006240488799898697,
-9.622162373026585e-05,
-0.006436207679519103,
-0.012779471908040341,
-0.02038153550619876,
-0.027999803010447587,
-0.035622012697103154,
-0.043236398934156144,
-0.04493583574805963,
-0.04663615264317309,
-0.048337350303318156,
-0.050039429413028365,
-0.051742390657545205,
-0.05342028484370278,
-0.05254242298580185,
-0.05166410580536087,
-0.05078533294804249,
-0.04990610405914272,
-0.05409792133358102,
-0.05832916277634124,
-0.06256260169582961,
-0.06660356886269536,
-0.04779792991567815,
-0.028982516728038848,
-0.010157321677553965,
0.00861320615127981,
0.01913736978785662,
0.029667009055877668,
0.04020212822983975,
0.050742731588695494,
0.061288823415841555,
0.07184040799914815,
0.1043252636301016,
0.13687829834471027,
0.1694483010211072,
0.202035284929368,
0.23624619427167134,
0.27048596623174515,
0.30474360397422756,
0.3390191214858807,
0.36358851509924695,
0.38814205928193013,
0.41270842850729195,
0.4372876328262819,
0.4372876328262819
],
"dgt": [
2.714526681131686,
2.705443819238505,
2.6947834587664494,
2.6841217449620203,
2.6681935771243177,
2.6521732021128046,
2.630396440815385,
2.602860350286428,
2.5696460593920065,
2.5364027376452056,
2.499446286796604,
2.4587748041127506,
2.414398437185221,
2.3699990328716107,
2.322373696229342,
2.271520771371253,
2.2174389328192197,
2.16337565384239,
2.1183028432496016,
2.082225099873648,
2.055100772005235,
2.0279625371819305,
2.0008103857988204,
1.9736443063300082,
1.9482128147680253,
1.9245345552113182,
1.9026104247588487,
1.8806927939516411,
1.862235672444246,
1.847275503201129,
1.835814081380705,
1.824381436842932,
1.8139629377087627,
1.8045606557581335,
1.7961751115773796,
1.7877868031023945,
1.7793941781790852,
1.7709972329654864,
1.7625959636196327,
1.7541903672600494,
1.7459181197626403,
1.737780757913635,
1.7297783508684146,
1.7217732861435076,
1.7137640932265894,
1.7057507692361864,
1.6918150918099673,
1.6719047669939942,
1.6460167077689267,
1.6201194134191075,
1.5986915141218316,
1.5817353179379183,
1.569199764184379,
1.5566577309558969,
1.545374152761467,
1.5353620432989845,
1.5266220576235803,
1.5178910621476225,
1.5097346239790443,
1.502153039909686,
1.495145456062699,
1.488134243479226,
1.48111939735681,
1.474100442252211,
1.4670307626366115,
1.4599103316162523,
1.45273959485914,
1.445565137158368,
1.4340878115214444,
1.418273806730323,
1.3981208704326855,
1.3779439775587023,
1.3598972673004606,
1.3439818461440451,
1.3301807335621048,
1.316383926863083,
1.3040618749785347,
1.2932153453410835,
1.2838336236692311,
1.2744470198196236,
1.2650555289898042,
1.2556591482982988,
1.2428104897182262,
1.2264996957264114,
1.2067249615595257,
1.1869318618366975,
1.1672278304018044,
1.1476135933863398,
1.1280891949729075,
1.108555289615659,
1.0895983485572227,
1.0712204022764056,
1.0534217504465226,
1.0356155337864215,
1.017807767853702,
1.0
],
"nf_model": {
"enabled": true,
"nf1": 5.727887800964238,
"nf2": 7.727887800964238,
"delta_p": 5.238350271545567
},
"gain_ripple": [
0.1359703369791596,
0.11822862697916037,
0.09542181697916163,
0.06245819697916133,
0.02602813697916062,
-0.0036199830208403228,
-0.018326963020840026,
-0.0246928330208398,
-0.016792253020838643,
-0.0028138630208403015,
0.017572956979162058,
0.038328296979159404,
0.054956336979159914,
0.0670723869791594,
0.07091459697916136,
0.07094413697916124,
0.07114372697916238,
0.07533675697916209,
0.08731066697916035,
0.10313984697916112,
0.12276252697916235,
0.14239527697916188,
0.15945681697916214,
0.1739275269791598,
0.1767381569791624,
0.17037189697916233,
0.15216302697916007,
0.13114358697916018,
0.10802383697916085,
0.08548825697916129,
0.06916723697916183,
0.05848224697916038,
0.05447361697916264,
0.05154489697916276,
0.04946107697915991,
0.04717897697916129,
0.04551704697916037,
0.04467697697916151,
0.04072968697916224,
0.03285456697916089,
0.023488786979161347,
0.01659282697915998,
0.013321846979160057,
0.011234826979162449,
0.01030063697916006,
0.00936596697916059,
0.00874012697916271,
0.00842583697916055,
0.006965146979162284,
0.0040435869791615175,
0.0007104669791608842,
-0.0015763130208377163,
-0.006936193020838033,
-0.016475303020840215,
-0.028748483020837767,
-0.039618433020837784,
-0.051112303020840244,
-0.06468462302083822,
-0.07868024302083754,
-0.09101254302083817,
-0.10103437302083762,
-0.11041488302083735,
-0.11916081302083725,
-0.12789859302083784,
-0.1353792530208402,
-0.14160178302083892,
-0.1455411330208385,
-0.1484450830208388,
-0.14823350302084037,
-0.14591937302083835,
-0.1409032730208395,
-0.13525493302083902,
-0.1279646530208396,
-0.11963431302083904,
-0.11089282302084058,
-0.1027863830208382,
-0.09717347302083823,
-0.09343261302083761,
-0.0913487130208388,
-0.08906007302083907,
-0.0865687230208394,
-0.08407607302083875,
-0.07844600302084004,
-0.06968090302083851,
-0.05947139302083926,
-0.05095282302083959,
-0.042428283020839785,
-0.03218106302083967,
-0.01819858302084043,
-0.0021726530208390216,
0.01393231697916164,
0.028098946979159933,
0.040326236979161934,
0.05257029697916238,
0.06479749697916048,
0.07704745697916238
]
}
}

View File

@@ -1,90 +0,0 @@
#!/usr/bin/env python3
"""
@author: briantaylor
@author: giladgoldfarb
@author: jeanluc-auge
Transmission setup example:
reads from network json (default = examples/edfa/edfa_example_network.json)
propagates a 96-channel comb
"""
from argparse import ArgumentParser
from json import load
from sys import exit
from pathlib import Path
from logging import getLogger, basicConfig, INFO, ERROR, DEBUG
from matplotlib.pyplot import show, axis, figure, title
from networkx import (draw_networkx_nodes, draw_networkx_edges,
draw_networkx_labels, dijkstra_path)
from gnpy.core import network_from_json, build_network
from gnpy.core.elements import Transceiver, Fiber, Edfa
from gnpy.core.info import SpectralInformation, Channel, Power
#from gnpy.core.algorithms import closed_paths
logger = getLogger(__package__ or __file__)
def format_si(spectral_infos):
    return '\n'.join([
        f'#{idx} Carrier(frequency={c.frequency},\n power=Power(signal={c.power.signal}, nli={c.power.nli}, ase={c.power.ase}))'
        for idx, si in sorted(set(spectral_infos))
        for c in set(si.carriers)
    ])
logger = getLogger('gnpy.core')
def main(args):
with open(args.filename) as f:
json_data = load(f)
network = network_from_json(json_data)
build_network(network)
spacing = 0.05 #THz
si = SpectralInformation() # !! SI units W, Hz
si = si.update(carriers=tuple(Channel(f, (191.3+spacing*f)*1e12,
32e9, 0.15, Power(1e-3, 0, 0)) for f in range(1,97)))
trx = [n for n in network.nodes() if isinstance(n, Transceiver)]
source, sink = trx[0], trx[1]
path = dijkstra_path(network, source, sink)
print(f'There are {len(path)} network elements between {source} and {sink}')
for el in path:
si = el(si)
print(el)
nodelist = [n for n in network.nodes() if isinstance(n, (Transceiver, Fiber))]
pathnodes = [n for n in path if isinstance(n, (Transceiver, Fiber))]
edgelist = [(u, v) for u, v in zip(pathnodes, pathnodes[1:])]
node_color = ['#ff0000' if n is source or n is sink else
'#900000' if n in path else '#ffdfdf'
for n in nodelist]
edge_color = ['#ff9090' if u in path and v in path else '#ababab'
for u, v in edgelist]
labels = {n: n.location.city if isinstance(n, Transceiver) else ''
for n in pathnodes}
fig = figure()
pos = {n: (n.lng, n.lat) for n in nodelist}
kwargs = {'figure': fig, 'pos': pos}
plot = draw_networkx_nodes(network, nodelist=nodelist, node_color=node_color, **kwargs)
draw_networkx_edges(network, edgelist=edgelist, edge_color=edge_color, **kwargs)
draw_networkx_labels(network, labels=labels, font_size=14, **kwargs)
title(f'Propagating from {source.loc.city} to {sink.loc.city}')
axis('off')
show()
parser = ArgumentParser()
parser.add_argument('filename', nargs='?', type=Path,
default= Path(__file__).parent / 'edfa/edfa_example_network.json')
parser.add_argument('-v', '--verbose', action='count')
if __name__ == '__main__':
args = parser.parse_args()
level = {1: INFO, 2: DEBUG}.get(args.verbose, ERROR)
logger.setLevel(level)
basicConfig()
exit(main(args))


@@ -0,0 +1,8 @@
'''
GNPy is an open-source, community-developed library for building route planning and optimization tools in real-world mesh optical networks. It is based on the Gaussian Noise Model.
Signal propagation is implemented in :py:mod:`.core`.
Path finding and spectrum assignment are in :py:mod:`.topology`.
Various tools and auxiliary code, including the JSON I/O handling, are in
:py:mod:`.tools`.
'''
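As a rough orientation, here is a minimal sketch of how these layers fit together. It is only an illustration: the JSON loader helpers are assumed to live in gnpy.tools.json_io, the file names are placeholders, and build_network's signature is taken from the gnpy/core/network.py diff below.

    from gnpy.tools.json_io import load_equipment, load_network   # assumed loader helpers
    from gnpy.core.network import build_network
    from gnpy.core.utils import automatic_nch, lin2db

    equipment = load_equipment('eqpt_config.json')       # placeholder file names
    network = load_network('topology.json', equipment)
    si = equipment['SI']['default']
    p_db = si.power_dbm
    p_total_db = p_db + lin2db(automatic_nch(si.f_min, si.f_max, si.spacing))
    build_network(network, equipment, p_db, p_total_db)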


@@ -1,6 +1,9 @@
#!/usr/bin/env python3
'''
Simulation of signal propagation in the DWDM network
from . import elements
from .execute import *
from .network import *
from .utils import *
Optical signals, as defined via :class:`.info.SpectralInformation`, enter
:py:mod:`.elements` which compute how these signals are affected as they travel
through the :py:mod:`.network`.
The simulation is controlled via :py:mod:`.parameters` and implemented mainly
via :py:mod:`.science_utils`.
'''
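In practice a SpectralInformation object is passed through each element of a computed path in turn; a minimal sketch of that idiom, mirroring the removed example script earlier in this change set where elements are callable:

    # path: ordered network elements between two transceivers, si: a SpectralInformation
    for element in path:
        si = element(si)   # each element returns the spectral information it would emit
        print(element)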

gnpy/core/ansi_escapes.py (new file, 15 lines)

@@ -0,0 +1,15 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.core.ansi_escapes
======================
A random subset of ANSI terminal escape codes for colored messages
'''
red = '\x1b[1;31;40m'
blue = '\x1b[1;34;40m'
cyan = '\x1b[1;36;40m'
yellow = '\x1b[1;33;40m'
reset = '\x1b[0m'
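These constants are used by plain string interpolation, as in the warnings emitted by gnpy.core.network later in this change set, for example:

    from gnpy.core import ansi_escapes

    print(f'{ansi_escapes.red}WARNING:{ansi_escapes.reset} target gain is below the amplifier minimum gain')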

File diff suppressed because it is too large

gnpy/core/equipment.py (new file, 73 lines)

@@ -0,0 +1,73 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.core.equipment
===================
This module contains functionality for specifying equipment.
'''
from gnpy.core.utils import automatic_nch, db2lin
from gnpy.core.exceptions import EquipmentConfigError
def trx_mode_params(equipment, trx_type_variety='', trx_mode='', error_message=False):
"""return the trx and SI parameters from eqpt_config for a given type_variety and mode (ie format)"""
trx_params = {}
default_si_data = equipment['SI']['default']
try:
trxs = equipment['Transceiver']
# if called from path_requests_run.py, trx_mode is filled with None when not specified by user
# if called from transmission_main.py, trx_mode is ''
if trx_mode is not None:
mode_params = next(mode for trx in trxs
if trx == trx_type_variety
for mode in trxs[trx].mode
if mode['format'] == trx_mode)
trx_params = {**mode_params}
# sanity check: the baud rate must be smaller than the min spacing
if trx_params['baud_rate'] > trx_params['min_spacing']:
raise EquipmentConfigError(f'Inconsistency in equipment library:\n Transponder "{trx_type_variety}" mode "{trx_params["format"]}" ' +
f'has baud rate {trx_params["baud_rate"]*1e-9} GHz greater than min_spacing {trx_params["min_spacing"]*1e-9}.')
else:
mode_params = {"format": "undetermined",
"baud_rate": None,
"OSNR": None,
"bit_rate": None,
"roll_off": None,
"tx_osnr": None,
"min_spacing": None,
"cost": None}
trx_params = {**mode_params}
trx_params['f_min'] = equipment['Transceiver'][trx_type_variety].frequency['min']
trx_params['f_max'] = equipment['Transceiver'][trx_type_variety].frequency['max']
# TODO: novel automatic feature maybe unwanted if spacing is specified
# trx_params['spacing'] = _automatic_spacing(trx_params['baud_rate'])
# temp = trx_params['spacing']
# print(f'spacing {temp}')
except StopIteration:
if error_message:
raise EquipmentConfigError(f'Could not find transponder "{trx_type_variety}" with mode "{trx_mode}" in equipment library')
else:
# default transponder characteristics
# mainly used with transmission_main_example.py
trx_params['f_min'] = default_si_data.f_min
trx_params['f_max'] = default_si_data.f_max
trx_params['baud_rate'] = default_si_data.baud_rate
trx_params['spacing'] = default_si_data.spacing
trx_params['OSNR'] = None
trx_params['bit_rate'] = None
trx_params['cost'] = None
trx_params['roll_off'] = default_si_data.roll_off
trx_params['tx_osnr'] = default_si_data.tx_osnr
trx_params['min_spacing'] = None
nch = automatic_nch(trx_params['f_min'], trx_params['f_max'], trx_params['spacing'])
trx_params['nb_channel'] = nch
print(f'There are {nch} channels propagating')
trx_params['power'] = db2lin(default_si_data.power_dbm) * 1e-3
return trx_params
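A hedged example call; the 'Voyager' type_variety and 'mode 1' format are illustrative names that would have to exist in the loaded equipment library, they are not defined in this diff:

    trx_params = trx_mode_params(equipment, trx_type_variety='Voyager',
                                 trx_mode='mode 1', error_message=True)
    print(trx_params['baud_rate'], trx_params['min_spacing'], trx_params['nb_channel'])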

gnpy/core/exceptions.py (new file, 37 lines)

@@ -0,0 +1,37 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.core.exceptions
====================
Exceptions thrown by other gnpy modules
'''
class ConfigurationError(Exception):
'''User-provided configuration contains an error'''
class EquipmentConfigError(ConfigurationError):
'''Incomplete or wrong configuration within the equipment library'''
class NetworkTopologyError(ConfigurationError):
'''Topology of user-provided network is wrong'''
class ServiceError(Exception):
'''Service of user-provided request is wrong'''
class DisjunctionError(ServiceError):
'''Disjunction of user-provided request can not be satisfied'''
class SpectrumError(Exception):
'''Spectrum errors of the program'''
class ParametersError(ConfigurationError):
'''Incomplete or wrong configurations within parameters json'''
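Because EquipmentConfigError, NetworkTopologyError and ParametersError all derive from ConfigurationError, callers can handle them at a single catch site; a minimal sketch, reusing the placeholder names from the sketch near the top of this change set:

    from gnpy.core.exceptions import ConfigurationError

    try:
        build_network(network, equipment, p_db, p_total_db)
    except ConfigurationError as e:   # also catches EquipmentConfigError, NetworkTopologyError, ParametersError
        print(f'Configuration problem: {e}')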


@@ -1,2 +0,0 @@
#!/usr/bin/env python3


@@ -1,50 +1,57 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.core.info
==============
This module contains classes for modelling :class:`SpectralInformation`.
'''
from collections import namedtuple
from gnpy.core.utils import automatic_nch, lin2db
class ConvenienceAccess:
def __init_subclass__(cls):
for abbrev, field in getattr(cls, '_ABBREVS', {}).items():
setattr(cls, abbrev, property(lambda self, f=field: getattr(self, f)))
def update(self, **kwargs):
for abbrev, field in getattr(self, '_ABBREVS', {}).items():
if abbrev in kwargs:
kwargs[field] = kwargs.pop(abbrev)
return self._replace(**kwargs)
class Power(namedtuple('Power', 'signal nli ase')):
"""carriers power in W"""
class Power(namedtuple('Power', 'signal nonlinear_interference amplified_spontaneous_emission'), ConvenienceAccess):
_ABBREVS = {'nli': 'nonlinear_interference',
'ase': 'amplified_spontaneous_emission',}
class Channel(namedtuple('Channel', 'channel_number frequency baud_rate roll_off power'), ConvenienceAccess):
_ABBREVS = {'channel': 'channel_number',
'num_chan': 'channel_number',
'ffs': 'frequency',
'freq': 'frequency',}
class Channel(namedtuple('Channel', 'channel_number frequency baud_rate roll_off power chromatic_dispersion pmd')):
""" Class containing the parameters of a WDM signal.
class SpectralInformation(namedtuple('SpectralInformation', 'carriers'), ConvenienceAccess):
def __new__(cls, *carriers):
return super().__new__(cls, carriers)
:param channel_number: channel number in the WDM grid
:param frequency: central frequency of the signal (Hz)
:param baud_rate: the symbol rate of the signal (Baud)
:param roll_off: the roll off of the signal. It is a pure number between 0 and 1
:param power (gnpy.core.info.Power): power of signal, ASE noise and NLI (W)
:param chromatic_dispersion: chromatic dispersion (s/m)
:param pmd: polarization mode dispersion (s)
"""
if __name__ == '__main__':
class Pref(namedtuple('Pref', 'p_span0, p_spani, neq_ch ')):
"""noiseless reference power in dBm:
p_span0: initial target carrier power
p_spani: carrier power after element i
neq_ch: equivalent channel count in dB"""
class SpectralInformation(namedtuple('SpectralInformation', 'pref carriers')):
def __new__(cls, pref, carriers):
return super().__new__(cls, pref, carriers)
def create_input_spectral_information(f_min, f_max, roll_off, baud_rate, power, spacing):
# pref in dB : convert power lin into power in dB
pref = lin2db(power * 1e3)
nb_channel = automatic_nch(f_min, f_max, spacing)
si = SpectralInformation(
Channel(1, 193.95e12, 32e9, 0.15, # 193.95 THz, 32 Gbaud
Power(1e-3, 1e-6, 1e-6)), # 1 mW, 1uW, 1uW
Channel(1, 195.95e12, 32e9, 0.15, # 195.95 THz, 32 Gbaud
Power(1.2e-3, 1e-6, 1e-6)), # 1.2 mW, 1uW, 1uW
pref=Pref(pref, pref, lin2db(nb_channel)),
carriers=[
Channel(f, (f_min + spacing * f),
baud_rate, roll_off, Power(power, 0, 0), 0, 0) for f in range(1, nb_channel + 1)
]
)
si = SpectralInformation()
spacing = 0.05 #THz
si = si.update(carriers=tuple(Channel(f+1, 191.3+spacing*(f+1), 32e9, 0.15, Power(1e-3, f, 1)) for f in range(96)))
print(f'si = {si}')
print(f'si = {si.carriers[0].power.nli}')
print(f'si = {si.carriers[20].power.nli}')
"""
si2 = si.update(carriers=tuple(c.update(power = c.power.update(nli = c.power.nli * 1e5))
for c in si.carriers))
print(f'si2 = {si2}')
"""
return si
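A hedged example of building the input comb; the numbers echo the removed example above (50 GHz spacing, 32 GBaud, 0.15 roll-off, 1 mW per channel), while the 196.1 THz upper bound is an assumed, illustrative value:

    from gnpy.core.info import create_input_spectral_information

    si = create_input_spectral_information(f_min=191.3e12, f_max=196.1e12, roll_off=0.15,
                                           baud_rate=32e9, power=1e-3, spacing=50e9)
    print(len(si.carriers), si.pref)   # should report a 96-carrier comb and its reference power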


@@ -1,114 +1,484 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from networkx import DiGraph
'''
gnpy.core.network
=================
from gnpy.core import elements
from gnpy.core.elements import Fiber, Edfa, Transceiver, Roadm
from gnpy.core.units import UNITS
Working with networks which consist of network elements
'''
MAX_SPAN_LENGTH = 125000
TARGET_SPAN_LENGTH = 100000
MIN_SPAN_LENGTH = 75000
from scipy.interpolate import interp1d
from operator import attrgetter
from gnpy.core import ansi_escapes, elements
from gnpy.core.exceptions import ConfigurationError, NetworkTopologyError
from gnpy.core.utils import round2float, convert_length
from collections import namedtuple
def network_from_json(json_data):
# NOTE|dutc: we could use the following, but it would tie our data format
# too closely to the graph library
# from networkx import node_link_graph
g = DiGraph()
for el_config in json_data['elements']:
g.add_node(getattr(elements, el_config['type'])(el_config))
nodes = {k.uid: k for k in g.nodes()}
def edfa_nf(gain_target, variety_type, equipment):
amp_params = equipment['Edfa'][variety_type]
amp = elements.Edfa(
uid='calc_NF',
params=amp_params.__dict__,
operational={
'gain_target': gain_target,
'tilt_target': 0
}
)
amp.pin_db = 0
amp.nch = 88
return amp._calc_nf(True)
for cx in json_data['connections']:
from_node, to_node = cx['from_node'], cx['to_node']
g.add_edge(nodes[from_node], nodes[to_node])
return g
def select_edfa(raman_allowed, gain_target, power_target, equipment, uid, restrictions=None):
"""amplifer selection algorithm
@Orange Jean-Luc Augé
"""
Edfa_list = namedtuple('Edfa_list', 'variety power gain_min nf')
TARGET_EXTENDED_GAIN = equipment['Span']['default'].target_extended_gain
def calculate_new_length(fiber_length):
result = (fiber_length, 1)
if fiber_length > MAX_SPAN_LENGTH:
n_spans = int(fiber_length // TARGET_SPAN_LENGTH)
# for roadm restriction only: create a dict that also includes amps not allowed for design
# because the main use case is to have specific roadm amps which are not allowed for ILA
# with the auto design
edfa_dict = {name: amp for (name, amp) in equipment['Edfa'].items()
if restrictions is None or name in restrictions}
length1 = fiber_length / (n_spans+1)
result1 = (length1, n_spans+1)
delta1 = TARGET_SPAN_LENGTH-length1
pin = power_target - gain_target
length2 = fiber_length / n_spans
delta2 = length2-TARGET_SPAN_LENGTH
result2 = (length2, n_spans)
# create 2 lists of available amplifiers with relevant attributes for their selection
if length1<MIN_SPAN_LENGTH and length2<MAX_SPAN_LENGTH:
result = result2
elif length2>MAX_SPAN_LENGTH and length1>MIN_SPAN_LENGTH:
result = result1
# edfa list with:
# extended gain min allowance of 3dB: could be parametrized, but a bit complex
# extended gain max allowance TARGET_EXTENDED_GAIN is coming from eqpt_config.json
# power attribute includes power AND gain limitations
edfa_list = [Edfa_list(
variety=edfa_variety,
power=min(
pin
+ edfa.gain_flatmax
+ TARGET_EXTENDED_GAIN,
edfa.p_max
)
- power_target,
gain_min=gain_target + 3
- edfa.gain_min,
nf=edfa_nf(gain_target, edfa_variety, equipment))
for edfa_variety, edfa in edfa_dict.items()
if ((edfa.allowed_for_design or restrictions is not None) and not edfa.raman)]
# consider a Raman list because of different gain_min requirement:
# do not allow extended gain min for Raman
raman_list = [Edfa_list(
variety=edfa_variety,
power=min(
pin
+ edfa.gain_flatmax
+ TARGET_EXTENDED_GAIN,
edfa.p_max
)
- power_target,
gain_min=gain_target
- edfa.gain_min,
nf=edfa_nf(gain_target, edfa_variety, equipment))
for edfa_variety, edfa in edfa_dict.items()
if (edfa.allowed_for_design and edfa.raman)] \
if raman_allowed else []
# merge raman and edfa lists
amp_list = edfa_list + raman_list
# filter on min gain limitation:
acceptable_gain_min_list = [x for x in amp_list if x.gain_min > 0]
if len(acceptable_gain_min_list) < 1:
# do not take this empty list into account for the rest of the code
# but issue a warning to the user and do not consider Raman
# Raman below min gain should not be allowed because it is meant to be a design requirement
# and raman padding at the amplifier input is impossible!
if len(edfa_list) < 1:
raise ConfigurationError(f'auto_design could not find any amplifier \
to satisfy min gain requirement in node {uid} \
please increase span fiber padding')
else:
if delta1 < delta2:
result = result1
else:
result = result2
# TODO: convert to logging
print(
f'{ansi_escapes.red}WARNING:{ansi_escapes.reset} target gain in node {uid} is below all available amplifiers min gain: \
amplifier input padding will be assumed, consider increasing span fiber padding instead'
)
acceptable_gain_min_list = edfa_list
return result
# filter on gain+power limitation:
# this list checks both the gain and the power requirement
# because of the way .power is calculated in the list
acceptable_power_list = [x for x in acceptable_gain_min_list if x.power > 0]
if len(acceptable_power_list) < 1:
# no amplifier satisfies the required power, so pick the highest power(s):
power_max = max(acceptable_gain_min_list, key=attrgetter('power')).power
# check and pick if other amplifiers may have a similar gain/power
# allow a 0.3dB power range
# this allows choosing an amplifier with a better NF subsequently
acceptable_power_list = [x for x in acceptable_gain_min_list
if x.power - power_max > -0.3]
def split_fiber(network, fiber):
new_length, n_spans = calculate_new_length(fiber.length)
prev_node = fiber
if n_spans > 1:
next_nodes = [_ for _ in network.successors(fiber)]
for next_node in next_nodes:
network.remove_edge(fiber, next_node)
# gain and power requirements are resolved,
# => choose the amp with the best NF among the acceptable ones:
selected_edfa = min(acceptable_power_list, key=attrgetter('nf')) # filter on NF
# check what are the gain and power limitations of this amp
power_reduction = round(min(selected_edfa.power, 0), 2)
if power_reduction < -0.5:
print(
f'{ansi_escapes.red}WARNING:{ansi_escapes.reset} target gain and power in node {uid}\n \
is beyond all available amplifiers capabilities and/or extended_gain_range:\n\
a power reduction of {power_reduction} is applied\n'
)
new_params_length = new_length / UNITS[fiber.params.length_units]
config = {'uid':fiber.uid, 'type': 'Fiber', 'metadata': fiber.__dict__['metadata'], \
'params': fiber.__dict__['params']}
fiber.uid = config['uid'] + '_1'
fiber.length = new_length
fiber.loss = fiber.loss_coef * fiber.length
return selected_edfa.variety, power_reduction
for i in range(2, n_spans+1):
new_config = dict(config)
new_config['uid'] = new_config['uid'] + '_' + str(i)
new_config['params'].length = new_params_length
new_node = Fiber(new_config)
network.add_node(new_node)
network.add_edge(prev_node, new_node)
network = add_egress_amplifier(network, prev_node)
prev_node = new_node
for next_node in next_nodes:
network.add_edge(prev_node, next_node)
def target_power(network, node, equipment): # get_fiber_dp
SPAN_LOSS_REF = 20
POWER_SLOPE = 0.3
dp_range = list(equipment['Span']['default'].delta_power_range_db)
node_loss = span_loss(network, node)
try:
dp = round2float((node_loss - SPAN_LOSS_REF) * POWER_SLOPE, dp_range[2])
dp = max(dp_range[0], dp)
dp = min(dp_range[1], dp)
except KeyError:
raise ConfigurationError(f'invalid delta_power_range_db definition in eqpt_config[Span]'
f'delta_power_range_db: [lower_bound, upper_bound, step]')
if isinstance(node, elements.Roadm):
dp = 0
return dp
def prev_node_generator(network, node):
"""fused spans interest:
iterate over all predecessors while they are Fused or Fiber type"""
try:
prev_node = next(n for n in network.predecessors(node))
except StopIteration:
raise NetworkTopologyError(f'Node {node.uid} is not properly connected, please check network topology')
# yield and re-iterate
if isinstance(prev_node, elements.Fused) or isinstance(node, elements.Fused):
yield prev_node
yield from prev_node_generator(network, prev_node)
else:
return
def next_node_generator(network, node):
"""fused spans interest:
iterate over all successors while they are Fused or Fiber type"""
try:
next_node = next(n for n in network.successors(node))
except StopIteration:
raise NetworkTopologyError(f'Node {node.uid} is not properly connected, please check network topology')
# yield and re-iterate
if isinstance(next_node, elements.Fused) or isinstance(node, elements.Fused):
yield next_node
yield from next_node_generator(network, next_node)
else:
return
def span_loss(network, node):
"""Fused span interest:
return the total span loss of all the fibers spliced by a Fused node"""
loss = node.loss if node.passive else 0
try:
prev_node = next(n for n in network.predecessors(node))
if isinstance(prev_node, elements.Fused):
loss += sum(n.loss for n in prev_node_generator(network, node))
except StopIteration:
pass
try:
next_node = next(n for n in network.successors(node))
if isinstance(next_node, elements.Fused):
loss += sum(n.loss for n in next_node_generator(network, node))
except StopIteration:
pass
return loss
def find_first_node(network, node):
"""Fused node interest:
returns the 1st node at the origin of a succession of fused nodes
(aka no amp in between)"""
this_node = node
for this_node in prev_node_generator(network, node):
pass
return this_node
def find_last_node(network, node):
"""Fused node interest:
returns the last node in a succession of fused nodes
(aka no amp in between)"""
this_node = node
for this_node in next_node_generator(network, node):
pass
return this_node
def set_amplifier_voa(amp, power_target, power_mode):
VOA_MARGIN = 1 # do not maximize the VOA optimization
if amp.out_voa is None:
if power_mode:
voa = min(amp.params.p_max - power_target,
amp.params.gain_flatmax - amp.effective_gain)
voa = max(round2float(max(voa, 0), 0.5) - VOA_MARGIN, 0) if amp.params.out_voa_auto else 0
amp.delta_p = amp.delta_p + voa
amp.effective_gain = amp.effective_gain + voa
else:
voa = 0 # no output voa optimization in gain mode
amp.out_voa = voa
def set_egress_amplifier(network, roadm, equipment, pref_total_db):
power_mode = equipment['Span']['default'].power_mode
next_oms = (n for n in network.successors(roadm) if not isinstance(n, elements.Transceiver))
for oms in next_oms:
# go through all the OMS departing from the Roadm
node = roadm
prev_node = roadm
next_node = oms
# if isinstance(next_node, elements.Fused): #support ROADM wo egress amp for metro applications
# node = find_last_node(next_node)
# next_node = next(n for n in network.successors(node))
# next_node = find_last_node(next_node)
prev_dp = getattr(node.params, 'target_pch_out_db', 0)
dp = prev_dp
prev_voa = 0
voa = 0
while True:
# go through all nodes in the OMS (loop until next Roadm instance)
if isinstance(node, elements.Edfa):
node_loss = span_loss(network, prev_node)
voa = node.out_voa if node.out_voa else 0
if node.delta_p is None:
dp = target_power(network, next_node, equipment)
else:
dp = node.delta_p
gain_from_dp = node_loss + dp - prev_dp + prev_voa
if node.effective_gain is None or power_mode:
gain_target = gain_from_dp
else: # gain mode with effective_gain
gain_target = node.effective_gain
dp = prev_dp - node_loss + gain_target
power_target = pref_total_db + dp
raman_allowed = False
if isinstance(prev_node, elements.Fiber):
max_fiber_lineic_loss_for_raman = \
equipment['Span']['default'].max_fiber_lineic_loss_for_raman
raman_allowed = prev_node.params.loss_coef < max_fiber_lineic_loss_for_raman
# implementation of restrictions on roadm boosters
if isinstance(prev_node, elements.Roadm):
if prev_node.restrictions['booster_variety_list']:
restrictions = prev_node.restrictions['booster_variety_list']
else:
restrictions = None
elif isinstance(next_node, elements.Roadm):
# implementation of restrictions on roadm preamp
if next_node.restrictions['preamp_variety_list']:
restrictions = next_node.restrictions['preamp_variety_list']
else:
restrictions = None
else:
restrictions = None
if node.params.type_variety == '':
edfa_variety, power_reduction = select_edfa(raman_allowed, gain_target, power_target, equipment, node.uid, restrictions)
extra_params = equipment['Edfa'][edfa_variety]
node.params.update_params(extra_params.__dict__)
dp += power_reduction
gain_target += power_reduction
elif node.params.raman and not raman_allowed:
print(f'{ansi_escapes.red}WARNING{ansi_escapes.reset}: raman is used in node {node.uid}\n but fiber lineic loss is above threshold\n')
node.delta_p = dp if power_mode else None
node.effective_gain = gain_target
set_amplifier_voa(node, power_target, power_mode)
if isinstance(next_node, elements.Roadm) or isinstance(next_node, elements.Transceiver):
break
prev_dp = dp
prev_voa = voa
prev_node = node
node = next_node
# print(f'{node.uid}')
next_node = next(n for n in network.successors(node))
network = add_egress_amplifier(network, prev_node)
return network
def add_egress_amplifier(network, node):
next_nodes = [n for n in network.successors(node)
if not (isinstance(n, Edfa) or isinstance(n, Transceiver))]
i = 1
for next_node in next_nodes:
if not (isinstance(n, elements.Transceiver) or isinstance(n, elements.Fused) or isinstance(n, elements.Edfa))]
# no amplification for fused spans or TRX
for i, next_node in enumerate(next_nodes):
network.remove_edge(node, next_node)
amp = elements.Edfa(
uid=f'Edfa{i}_{node.uid}',
params={},
metadata={
'location': {
'latitude': (node.lat * 2 + next_node.lat * 2) / 4,
'longitude': (node.lng * 2 + next_node.lng * 2) / 4,
'city': node.loc.city,
'region': node.loc.region,
}
},
operational={
'gain_target': None,
'tilt_target': 0,
})
network.add_node(amp)
if isinstance(node, elements.Fiber):
edgeweight = node.params.length
else:
edgeweight = 0.01
network.add_edge(node, amp, weight=edgeweight)
network.add_edge(amp, next_node, weight=0.01)
uid = 'Edfa' + str(i)+ '_' + str(node.uid)
metadata = next_node.metadata
operational = {'gain_target': node.loss, 'tilt_target': 0}
edfa_config_json = 'edfa_config.json'
config = {'uid':uid, 'type': 'Edfa', 'metadata': metadata, \
'config_from_json': edfa_config_json, 'operational': operational}
new_edfa = Edfa(config)
network.add_node(new_edfa)
network.add_edge(node,new_edfa)
network.add_edge(new_edfa, next_node)
i +=1
return network
def calculate_new_length(fiber_length, bounds, target_length):
if fiber_length < bounds.stop:
return fiber_length, 1
def build_network(network):
fibers = [f for f in network.nodes() if isinstance(f, Fiber)]
n_spans = int(fiber_length // target_length)
length1 = fiber_length / (n_spans + 1)
delta1 = target_length - length1
result1 = (length1, n_spans + 1)
length2 = fiber_length / n_spans
delta2 = length2 - target_length
result2 = (length2, n_spans)
if (bounds.start <= length1 <= bounds.stop) and not(bounds.start <= length2 <= bounds.stop):
result = result1
elif (bounds.start <= length2 <= bounds.stop) and not(bounds.start <= length1 <= bounds.stop):
result = result2
else:
result = result1 if delta1 < delta2 else result2
return result
def split_fiber(network, fiber, bounds, target_length, equipment):
new_length, n_spans = calculate_new_length(fiber.params.length, bounds, target_length)
if n_spans == 1:
return
try:
next_node = next(network.successors(fiber))
prev_node = next(network.predecessors(fiber))
except StopIteration:
raise NetworkTopologyError(f'Fiber {fiber.uid} is not properly connected, please check network topology')
network.remove_node(fiber)
fiber.params.length = new_length
f = interp1d([prev_node.lng, next_node.lng], [prev_node.lat, next_node.lat])
xpos = [prev_node.lng + (next_node.lng - prev_node.lng) * (n + 1) / (n_spans + 1) for n in range(n_spans)]
ypos = f(xpos)
for span, lng, lat in zip(range(n_spans), xpos, ypos):
new_span = elements.Fiber(uid=f'{fiber.uid}_({span+1}/{n_spans})',
type_variety=fiber.type_variety,
metadata={
'location': {
'latitude': lat,
'longitude': lng,
'city': fiber.loc.city,
'region': fiber.loc.region,
}
},
params=fiber.params.asdict())
if isinstance(prev_node, elements.Fiber):
edgeweight = prev_node.params.length
else:
edgeweight = 0.01
network.add_edge(prev_node, new_span, weight=edgeweight)
prev_node = new_span
if isinstance(prev_node, elements.Fiber):
edgeweight = prev_node.params.length
else:
edgeweight = 0.01
network.add_edge(prev_node, next_node, weight=edgeweight)
def add_connector_loss(network, fibers, default_con_in, default_con_out, EOL):
for fiber in fibers:
network = split_fiber(network, fiber)
if fiber.params.con_in is None:
fiber.params.con_in = default_con_in
if fiber.params.con_out is None:
fiber.params.con_out = default_con_out
next_node = next(n for n in network.successors(fiber))
if not isinstance(next_node, elements.Fused):
fiber.params.con_out += EOL
roadms = [r for r in network.nodes() if isinstance(r, Roadm)]
def add_fiber_padding(network, fibers, padding):
"""last_fibers = (fiber for n in network.nodes()
if not (isinstance(n, elements.Fiber) or isinstance(n, elements.Fused))
for fiber in network.predecessors(n)
if isinstance(fiber, elements.Fiber))"""
for fiber in fibers:
this_span_loss = span_loss(network, fiber)
try:
next_node = next(network.successors(fiber))
except StopIteration:
raise NetworkTopologyError(f'Fiber {fiber.uid} is not properly connected, please check network topology')
if this_span_loss < padding and not (isinstance(next_node, elements.Fused)):
# add a padding att_in at the input of the 1st fiber:
# address the case when several fibers are spliced together
first_fiber = find_first_node(network, fiber)
# in order to support no booster, a fused element might be placed
# just after a roadm: need to check that first_fiber is really a fiber
if isinstance(first_fiber, elements.Fiber):
if first_fiber.params.att_in is None:
first_fiber.params.att_in = padding - this_span_loss
else:
first_fiber.params.att_in = first_fiber.params.att_in + padding - this_span_loss
def build_network(network, equipment, pref_ch_db, pref_total_db):
default_span_data = equipment['Span']['default']
max_length = int(convert_length(default_span_data.max_length, default_span_data.length_units))
min_length = max(int(default_span_data.padding / 0.2 * 1e3), 50_000)
bounds = range(min_length, max_length)
target_length = max(min_length, 90_000)
default_con_in = default_span_data.con_in
default_con_out = default_span_data.con_out
padding = default_span_data.padding
# set roadm loss for gain_mode before to build network
fibers = [f for f in network.nodes() if isinstance(f, elements.Fiber)]
add_connector_loss(network, fibers, default_con_in, default_con_out, default_span_data.EOL)
add_fiber_padding(network, fibers, padding)
# don't group split fiber and add amp in the same loop
# =>for code clarity (at the expense of speed):
for fiber in fibers:
split_fiber(network, fiber, bounds, target_length, equipment)
amplified_nodes = [n for n in network.nodes() if isinstance(n, elements.Fiber) or isinstance(n, elements.Roadm)]
for node in amplified_nodes:
add_egress_amplifier(network, node)
roadms = [r for r in network.nodes() if isinstance(r, elements.Roadm)]
for roadm in roadms:
add_egress_amplifier(network, roadm)
set_egress_amplifier(network, roadm, equipment, pref_total_db)
# support older json input topology wo Roadms:
if len(roadms) == 0:
trx = [t for t in network.nodes() if isinstance(t, elements.Transceiver)]
for t in trx:
set_egress_amplifier(network, t, equipment, pref_total_db)
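To make the selection logic of select_edfa above easier to follow (filter on minimum gain, then on gain-plus-power headroom, then keep the lowest NF), here is a simplified, self-contained toy sketch with made-up amplifier figures; it deliberately omits the 0.3 dB power-tolerance fallback of the real function:

    from collections import namedtuple
    from operator import attrgetter

    Edfa_list = namedtuple('Edfa_list', 'variety power gain_min nf')
    # made-up candidates: positive gain_min and power mean the corresponding requirement is met
    candidates = [Edfa_list('amp_low_gain', power=1.5, gain_min=4.0, nf=5.8),
                  Edfa_list('amp_medium_gain', power=0.2, gain_min=9.0, nf=6.5),
                  Edfa_list('amp_high_power', power=-0.4, gain_min=7.0, nf=7.1)]
    acceptable_gain_min = [x for x in candidates if x.gain_min > 0]
    acceptable_power = [x for x in acceptable_gain_min if x.power > 0]
    selected = min(acceptable_power or acceptable_gain_min, key=attrgetter('nf'))
    print(selected.variety)   # 'amp_low_gain': best NF among amps meeting both limits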


@@ -1,60 +0,0 @@
#! /bin/usr/python3
from uuid import uuid4
from gnpy.core.utils import load_json
class ConfigStruct:
def __init__(self, **config):
if config is None:
return None
if 'config_from_json' in config:
json_config = load_json(config['config_from_json'])
self.set_config_attr(json_config)
self.set_config_attr(config)
def set_config_attr(self, config):
for k, v in config.items():
setattr(self, k, ConfigStruct(**v)
if isinstance(v, dict) else v)
def __repr__(self):
return f'{self.__dict__}'
class Node:
def __init__(self, config=None):
self.config = ConfigStruct(**config)
if self.config is None or not hasattr(self.config, 'uid'):
self.uid = uuid4()
else:
self.uid = self.config.uid
if hasattr(self.config, 'params'):
self.params = self.config.params
if hasattr(self.config, 'metadata'):
self.metadata = self.config.metadata
if hasattr(self.config, 'operational'):
self.operational = self.config.operational
@property
def coords(self):
return tuple(self.lng, self.lat)
@property
def location(self):
return self.config.metadata.location
@property
def loc(self): # Aliases .location
return self.location
@property
def lng(self):
return self.config.metadata.location.longitude
@property
def lat(self):
return self.config.metadata.location.latitude

gnpy/core/parameters.py (new file, 287 lines)

@@ -0,0 +1,287 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.core.parameters
====================
This module contains all parameters to configure standard network elements.
"""
from scipy.constants import c, pi
from numpy import squeeze, log10, exp
from gnpy.core.utils import db2lin, convert_length
from gnpy.core.exceptions import ParametersError
class Parameters:
def asdict(self):
class_dict = self.__class__.__dict__
instance_dict = self.__dict__
new_dict = {}
for key in class_dict:
if isinstance(class_dict[key], property):
new_dict[key] = instance_dict['_' + key]
return new_dict
class PumpParams(Parameters):
def __init__(self, power, frequency, propagation_direction):
self._power = power
self._frequency = frequency
self._propagation_direction = propagation_direction
@property
def power(self):
return self._power
@property
def frequency(self):
return self._frequency
@property
def propagation_direction(self):
return self._propagation_direction
class RamanParams(Parameters):
def __init__(self, **kwargs):
self._flag_raman = kwargs['flag_raman']
self._space_resolution = kwargs['space_resolution'] if 'space_resolution' in kwargs else None
self._tolerance = kwargs['tolerance'] if 'tolerance' in kwargs else None
@property
def flag_raman(self):
return self._flag_raman
@property
def space_resolution(self):
return self._space_resolution
@property
def tolerance(self):
return self._tolerance
class NLIParams(Parameters):
def __init__(self, **kwargs):
self._nli_method_name = kwargs['nli_method_name']
self._wdm_grid_size = kwargs['wdm_grid_size']
self._dispersion_tolerance = kwargs['dispersion_tolerance']
self._phase_shift_tolerance = kwargs['phase_shift_tolerance']
self._f_cut_resolution = None
self._f_pump_resolution = None
self._computed_channels = kwargs['computed_channels'] if 'computed_channels' in kwargs else None
@property
def nli_method_name(self):
return self._nli_method_name
@property
def wdm_grid_size(self):
return self._wdm_grid_size
@property
def dispersion_tolerance(self):
return self._dispersion_tolerance
@property
def phase_shift_tolerance(self):
return self._phase_shift_tolerance
@property
def f_cut_resolution(self):
return self._f_cut_resolution
@f_cut_resolution.setter
def f_cut_resolution(self, f_cut_resolution):
self._f_cut_resolution = f_cut_resolution
@property
def f_pump_resolution(self):
return self._f_pump_resolution
@f_pump_resolution.setter
def f_pump_resolution(self, f_pump_resolution):
self._f_pump_resolution = f_pump_resolution
@property
def computed_channels(self):
return self._computed_channels
class SimParams(Parameters):
def __init__(self, **kwargs):
try:
if 'nli_parameters' in kwargs:
self._nli_params = NLIParams(**kwargs['nli_parameters'])
else:
self._nli_params = None
if 'raman_parameters' in kwargs:
self._raman_params = RamanParams(**kwargs['raman_parameters'])
else:
self._raman_params = None
except KeyError as e:
raise ParametersError(f'Simulation parameters must include {e}. Configuration: {kwargs}')
@property
def nli_params(self):
return self._nli_params
@property
def raman_params(self):
return self._raman_params
class FiberParams(Parameters):
def __init__(self, **kwargs):
try:
self._length = convert_length(kwargs['length'], kwargs['length_units'])
# fixed attenuator for padding
self._att_in = kwargs['att_in'] if 'att_in' in kwargs else 0
# if not defined in the network json connector loss in/out
# the None value will be updated in network.py[build_network]
# with default values from eqpt_config.json[Spans]
self._con_in = kwargs['con_in'] if 'con_in' in kwargs else None
self._con_out = kwargs['con_out'] if 'con_out' in kwargs else None
if 'ref_wavelength' in kwargs:
self._ref_wavelength = kwargs['ref_wavelength']
self._ref_frequency = c / self.ref_wavelength
elif 'ref_frequency' in kwargs:
self._ref_frequency = kwargs['ref_frequency']
self._ref_wavelength = c / self.ref_frequency
else:
self._ref_wavelength = 1550e-9
self._ref_frequency = c / self.ref_wavelength
self._dispersion = kwargs['dispersion'] # s/m/m
self._dispersion_slope = kwargs['dispersion_slope'] if 'dispersion_slope' in kwargs else \
-2 * self._dispersion/self.ref_wavelength # s/m/m/m
self._beta2 = -(self.ref_wavelength ** 2) * self.dispersion / (2 * pi * c) # 1/(m * Hz^2)
# Eq. (3.23) in Abramczyk, Halina. "Dispersion phenomena in optical fibers." Virtual European University
# on Lasers. Available online: http://mitr.p.lodz.pl/evu/lectures/Abramczyk3.pdf
# (accessed on 25 March 2018) (2005).
self._beta3 = ((self.dispersion_slope - (4*pi*c/self.ref_wavelength**3) * self.beta2) /
(2*pi*c/self.ref_wavelength**2)**2)
self._gamma = kwargs['gamma'] # 1/W/m
self._pmd_coef = kwargs['pmd_coef'] # s/sqrt(m)
if isinstance(kwargs['loss_coef'], dict):
self._loss_coef = squeeze(kwargs['loss_coef']['loss_coef_power']) * 1e-3 # lineic loss dB/m
self._f_loss_ref = squeeze(kwargs['loss_coef']['frequency']) # Hz
else:
self._loss_coef = kwargs['loss_coef'] * 1e-3 # lineic loss dB/m
self._f_loss_ref = 193.5e12 # Hz
self._lin_attenuation = db2lin(self.length * self.loss_coef)
self._lin_loss_exp = self.loss_coef / (10 * log10(exp(1))) # linear power exponent loss Neper/m
self._effective_length = (1 - exp(- self.lin_loss_exp * self.length)) / self.lin_loss_exp
self._asymptotic_length = 1 / self.lin_loss_exp
# raman parameters (not compulsory)
self._raman_efficiency = kwargs['raman_efficiency'] if 'raman_efficiency' in kwargs else None
self._pumps_loss_coef = kwargs['pumps_loss_coef'] if 'pumps_loss_coef' in kwargs else None
except KeyError as e:
raise ParametersError(f'Fiber configurations json must include {e}. Configuration: {kwargs}')
@property
def length(self):
return self._length
@length.setter
def length(self, length):
"""length must be in m"""
self._length = length
@property
def att_in(self):
return self._att_in
@att_in.setter
def att_in(self, att_in):
self._att_in = att_in
@property
def con_in(self):
return self._con_in
@con_in.setter
def con_in(self, con_in):
self._con_in = con_in
@property
def con_out(self):
return self._con_out
@con_out.setter
def con_out(self, con_out):
self._con_out = con_out
@property
def dispersion(self):
return self._dispersion
@property
def dispersion_slope(self):
return self._dispersion_slope
@property
def gamma(self):
return self._gamma
@property
def pmd_coef(self):
return self._pmd_coef
@property
def ref_wavelength(self):
return self._ref_wavelength
@property
def ref_frequency(self):
return self._ref_frequency
@property
def beta2(self):
return self._beta2
@property
def beta3(self):
return self._beta3
@property
def loss_coef(self):
return self._loss_coef
@property
def f_loss_ref(self):
return self._f_loss_ref
@property
def lin_loss_exp(self):
return self._lin_loss_exp
@property
def lin_attenuation(self):
return self._lin_attenuation
@property
def effective_length(self):
return self._effective_length
@property
def asymptotic_length(self):
return self._asymptotic_length
@property
def raman_efficiency(self):
return self._raman_efficiency
@property
def pumps_loss_coef(self):
return self._pumps_loss_coef
def asdict(self):
dictionary = super().asdict()
dictionary['loss_coef'] = self.loss_coef * 1e3
dictionary['length_units'] = 'm'
return dictionary
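As a sanity check on the dispersion handling above: beta2 = -dispersion * ref_wavelength**2 / (2 * pi * c), so for a standard single-mode fiber (an assumed dispersion of 16.7 ps/nm/km at 1550 nm, not a value from this diff) the result is roughly -21 ps^2/km:

    from scipy.constants import c, pi

    dispersion = 16.7e-6        # s/m/m, i.e. 16.7 ps/nm/km (assumed)
    ref_wavelength = 1550e-9    # m
    beta2 = -(ref_wavelength ** 2) * dispersion / (2 * pi * c)
    print(f'{beta2 * 1e27:.1f} ps^2/km')   # approximately -21.3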

gnpy/core/science_utils.py (new file, 732 lines)

@@ -0,0 +1,732 @@
import numpy as np
from operator import attrgetter
from logging import getLogger
import scipy.constants as ph
from scipy.integrate import solve_bvp
from scipy.integrate import cumtrapz
from scipy.interpolate import interp1d
from scipy.optimize import OptimizeResult
from math import isclose
from gnpy.core.utils import db2lin, lin2db
from gnpy.core.exceptions import EquipmentConfigError
logger = getLogger(__name__)
def propagate_raman_fiber(fiber, *carriers):
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
raman_params = sim_params.raman_params
nli_params = sim_params.nli_params
# apply input attenuation to carriers
attenuation_in = db2lin(fiber.params.con_in + fiber.params.att_in)
chan = []
for carrier in carriers:
pwr = carrier.power
pwr = pwr._replace(signal=pwr.signal / attenuation_in,
nli=pwr.nli / attenuation_in,
ase=pwr.ase / attenuation_in)
carrier = carrier._replace(power=pwr)
chan.append(carrier)
carriers = tuple(f for f in chan)
# evaluate fiber attenuation involving also SRS if required by sim_params
raman_solver = fiber.raman_solver
raman_solver.carriers = carriers
raman_solver.raman_pumps = fiber.raman_pumps
stimulated_raman_scattering = raman_solver.stimulated_raman_scattering
fiber_attenuation = (stimulated_raman_scattering.rho[:, -1])**-2
if not raman_params.flag_raman:
fiber_attenuation = tuple(fiber.params.lin_attenuation for _ in carriers)
# evaluate Raman ASE noise if required by sim_params and if raman pumps are present
if raman_params.flag_raman and fiber.raman_pumps:
raman_ase = raman_solver.spontaneous_raman_scattering.power[:, -1]
else:
raman_ase = tuple(0 for _ in carriers)
# evaluate nli and propagate in fiber
attenuation_out = db2lin(fiber.params.con_out)
nli_solver = fiber.nli_solver
nli_solver.stimulated_raman_scattering = stimulated_raman_scattering
nli_frequencies = []
computed_nli = []
for carrier in (c for c in carriers if c.channel_number in sim_params.nli_params.computed_channels):
resolution_param = frequency_resolution(carrier, carriers, sim_params, fiber)
f_cut_resolution, f_pump_resolution, _, _ = resolution_param
nli_params.f_cut_resolution = f_cut_resolution
nli_params.f_pump_resolution = f_pump_resolution
nli_frequencies.append(carrier.frequency)
computed_nli.append(nli_solver.compute_nli(carrier, *carriers))
new_carriers = []
for carrier, attenuation, rmn_ase in zip(carriers, fiber_attenuation, raman_ase):
carrier_nli = np.interp(carrier.frequency, nli_frequencies, computed_nli)
pwr = carrier.power
pwr = pwr._replace(signal=pwr.signal / attenuation / attenuation_out,
nli=(pwr.nli + carrier_nli) / attenuation / attenuation_out,
ase=((pwr.ase / attenuation) + rmn_ase) / attenuation_out)
new_carriers.append(carrier._replace(power=pwr))
return new_carriers
def frequency_resolution(carrier, carriers, sim_params, fiber):
def _get_freq_res_k_phi(delta_count, grid_size, alpha0, delta_z, beta2, k_tol, phi_tol):
res_phi = _get_freq_res_phase_rotation(delta_count, grid_size, delta_z, beta2, phi_tol)
res_k = _get_freq_res_dispersion_attenuation(delta_count, grid_size, alpha0, beta2, k_tol)
res_dict = {'res_phi': res_phi, 'res_k': res_k}
method = min(res_dict, key=res_dict.get)
return res_dict[method], method, res_dict
def _get_freq_res_dispersion_attenuation(delta_count, grid_size, alpha0, beta2, k_tol):
return k_tol * abs(alpha0) / abs(beta2) / (1 + delta_count) / (4 * np.pi ** 2 * grid_size)
def _get_freq_res_phase_rotation(delta_count, grid_size, delta_z, beta2, phi_tol):
return phi_tol / abs(beta2) / (1 + delta_count) / delta_z / (4 * np.pi ** 2 * grid_size)
grid_size = sim_params.nli_params.wdm_grid_size
delta_z = sim_params.raman_params.space_resolution
alpha0 = fiber.alpha0()
beta2 = fiber.params.beta2
k_tol = sim_params.nli_params.dispersion_tolerance
phi_tol = sim_params.nli_params.phase_shift_tolerance
f_pump_resolution, method_f_pump, res_dict_pump = \
_get_freq_res_k_phi(0, grid_size, alpha0, delta_z, beta2, k_tol, phi_tol)
f_cut_resolution = {}
method_f_cut = {}
res_dict_cut = {}
for cut_carrier in carriers:
delta_number = cut_carrier.channel_number - carrier.channel_number
delta_count = abs(delta_number)
f_res, method, res_dict = \
_get_freq_res_k_phi(delta_count, grid_size, alpha0, delta_z, beta2, k_tol, phi_tol)
f_cut_resolution[f'delta_{delta_number}'] = f_res
method_f_cut[delta_number] = method
res_dict_cut[delta_number] = res_dict
return [f_cut_resolution, f_pump_resolution, (method_f_cut, method_f_pump), (res_dict_cut, res_dict_pump)]
def raised_cosine_comb(f, *carriers):
""" Returns an array storing the PSD of a WDM comb of raised cosine shaped
channels at the input frequencies defined in array f
:param f: numpy array of frequencies in Hz
:param carriers: namedtuple describing the WDM comb
:return: PSD of the WDM comb evaluated over f
"""
psd = np.zeros(np.shape(f))
for carrier in carriers:
f_nch = carrier.frequency
g_ch = carrier.power.signal / carrier.baud_rate
ts = 1 / carrier.baud_rate
passband = (1 - carrier.roll_off) / (2 / carrier.baud_rate)
stopband = (1 + carrier.roll_off) / (2 / carrier.baud_rate)
ff = np.abs(f - f_nch)
tf = ff - passband
if carrier.roll_off == 0:
psd = np.where(tf <= 0, g_ch, 0.) + psd
else:
psd = g_ch * (np.where(tf <= 0, 1., 0.) + 1 / 2 * (1 + np.cos(np.pi * ts / carrier.roll_off * tf)) *
np.where(tf > 0, 1., 0.) * np.where(np.abs(ff) <= stopband, 1., 0.)) + psd
return psd
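# Example (illustrative, assuming `carrier` is a Channel namedtuple as defined in gnpy.core.info):
# the PSD returned above is flat at power.signal / baud_rate inside the passband, e.g.
#     f = np.arange(carrier.frequency - 20e9, carrier.frequency + 20e9, 1e9)
#     psd = raised_cosine_comb(f, carrier)
#     assert np.isclose(psd.max(), carrier.power.signal / carrier.baud_rate)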
class Simulation:
_shared_dict = {}
def __init__(self):
if type(self) == Simulation:
raise NotImplementedError('Simulation cannot be instantiated')
@classmethod
def set_params(cls, sim_params):
cls._shared_dict['sim_params'] = sim_params
@classmethod
def get_simulation(cls):
self = cls.__new__(cls)
return self
@property
def sim_params(self):
return self._shared_dict['sim_params']
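# Example (illustrative values): simulation parameters are built once, typically from the
# SimParams class in gnpy.core.parameters, and shared through the class-level _shared_dict:
#     sim_params = SimParams(**{'raman_parameters': {'flag_raman': False},
#                               'nli_parameters': {'nli_method_name': 'gn_model_analytic',
#                                                  'wdm_grid_size': 50e9,
#                                                  'dispersion_tolerance': 1,
#                                                  'phase_shift_tolerance': 0.1}})
#     Simulation.set_params(sim_params)
#     Simulation.get_simulation().sim_params is sim_params   # -> True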
class SpontaneousRamanScattering:
def __init__(self, frequency, z, power):
self.frequency = frequency
self.z = z
self.power = power
class StimulatedRamanScattering:
def __init__(self, frequency, z, rho, power):
self.frequency = frequency
self.z = z
self.rho = rho
self.power = power
class RamanSolver:
def __init__(self, fiber=None):
""" Initialize the Raman solver object.
:param fiber: instance of elements.py/Fiber.
:param carriers: tuple of carrier objects
:param raman_pumps: tuple containing pumps characteristics
"""
self._fiber = fiber
self._carriers = None
self._raman_pumps = None
self._stimulated_raman_scattering = None
self._spontaneous_raman_scattering = None
@property
def fiber(self):
return self._fiber
@property
def carriers(self):
return self._carriers
@carriers.setter
def carriers(self, carriers):
self._carriers = carriers
self._spontaneous_raman_scattering = None
self._stimulated_raman_scattering = None
@property
def raman_pumps(self):
return self._raman_pumps
@raman_pumps.setter
def raman_pumps(self, raman_pumps):
self._raman_pumps = raman_pumps
self._stimulated_raman_scattering = None
@property
def stimulated_raman_scattering(self):
if self._stimulated_raman_scattering is None:
self.calculate_stimulated_raman_scattering(self.carriers, self.raman_pumps)
return self._stimulated_raman_scattering
@property
def spontaneous_raman_scattering(self):
if self._spontaneous_raman_scattering is None:
self.calculate_spontaneous_raman_scattering(self.carriers, self.raman_pumps)
return self._spontaneous_raman_scattering
def calculate_spontaneous_raman_scattering(self, carriers, raman_pumps):
raman_efficiency = self.fiber.params.raman_efficiency
temperature = self.fiber.operational['temperature']
logger.debug('Start computing fiber Spontaneous Raman Scattering')
power_spectrum, freq_array, prop_direct, bn_array = self._compute_power_spectrum(carriers, raman_pumps)
alphap_fiber = self.fiber.alpha(freq_array)
freq_diff = abs(freq_array - np.reshape(freq_array, (len(freq_array), 1)))
interp_cr = interp1d(raman_efficiency['frequency_offset'], raman_efficiency['cr'])
cr = interp_cr(freq_diff)
# z propagation axis
z_array = self.stimulated_raman_scattering.z
ase_bc = np.zeros(freq_array.shape)
# calculate ase power
int_spontaneous_raman = self._int_spontaneous_raman(z_array, self._stimulated_raman_scattering.power,
alphap_fiber, freq_array, cr, freq_diff, ase_bc,
bn_array, temperature)
spontaneous_raman_scattering = SpontaneousRamanScattering(freq_array, z_array, int_spontaneous_raman.x)
logger.debug("Spontaneous Raman Scattering evaluated successfully")
self._spontaneous_raman_scattering = spontaneous_raman_scattering
@staticmethod
def _compute_power_spectrum(carriers, raman_pumps=None):
"""
Rearrangement of spectral and Raman pump information to make them compatible with Raman solver
:param carriers: a tuple of namedtuples describing the transmitted channels
:param raman_pumps: a namedtuple describing the Raman pumps
:return:
"""
# Signal power spectrum
pow_array = np.array([])
f_array = np.array([])
noise_bandwidth_array = np.array([])
for carrier in sorted(carriers, key=attrgetter('frequency')):
f_array = np.append(f_array, carrier.frequency)
pow_array = np.append(pow_array, carrier.power.signal)
ref_bw = carrier.baud_rate
noise_bandwidth_array = np.append(noise_bandwidth_array, ref_bw)
propagation_direction = np.ones(len(f_array))
# Raman pump power spectrum
if raman_pumps:
for pump in raman_pumps:
pow_array = np.append(pow_array, pump.power)
f_array = np.append(f_array, pump.frequency)
direction = +1 if pump.propagation_direction.lower() == 'coprop' else -1
propagation_direction = np.append(propagation_direction, direction)
noise_bandwidth_array = np.append(noise_bandwidth_array, ref_bw)
# Final sorting
ind = np.argsort(f_array)
f_array = f_array[ind]
pow_array = pow_array[ind]
propagation_direction = propagation_direction[ind]
return pow_array, f_array, propagation_direction, noise_bandwidth_array
def _int_spontaneous_raman(self, z_array, raman_matrix, alphap_fiber, freq_array,
cr_raman_matrix, freq_diff, ase_bc, bn_array, temperature):
spontaneous_raman_scattering = OptimizeResult()
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
dx = sim_params.raman_params.space_resolution
h = ph.value('Planck constant')
kb = ph.value('Boltzmann constant')
power_ase = np.nan * np.ones(raman_matrix.shape)
int_pump = cumtrapz(raman_matrix, z_array, dx=dx, axis=1, initial=0)
for f_ind, f_ase in enumerate(freq_array):
cr_raman = cr_raman_matrix[f_ind, :]
vibrational_loss = f_ase / freq_array[:f_ind]
eta = 1 / (np.exp((h * freq_diff[f_ind, f_ind + 1:]) / (kb * temperature)) - 1)
int_fiber_loss = -alphap_fiber[f_ind] * z_array
int_raman_loss = np.sum((cr_raman[:f_ind] * vibrational_loss * int_pump[:f_ind, :].transpose()).transpose(),
axis=0)
int_raman_gain = np.sum((cr_raman[f_ind + 1:] * int_pump[f_ind + 1:, :].transpose()).transpose(), axis=0)
int_gain_loss = int_fiber_loss + int_raman_gain + int_raman_loss
new_ase = np.sum((cr_raman[f_ind + 1:] * (1 + eta) * raman_matrix[f_ind + 1:, :].transpose()).transpose()
* h * f_ase * bn_array[f_ind], axis=0)
bc_evolution = ase_bc[f_ind] * np.exp(int_gain_loss)
ase_evolution = np.exp(int_gain_loss) * cumtrapz(new_ase *
np.exp(-int_gain_loss), z_array, dx=dx, initial=0)
power_ase[f_ind, :] = bc_evolution + ase_evolution
spontaneous_raman_scattering.x = 2 * power_ase
return spontaneous_raman_scattering
def calculate_stimulated_raman_scattering(self, carriers, raman_pumps):
""" Returns stimulated Raman scattering solution including
fiber gain/loss profile.
:return: None
"""
# fiber parameters
fiber_length = self.fiber.params.length
raman_efficiency = self.fiber.params.raman_efficiency
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
if not sim_params.raman_params.flag_raman:
raman_efficiency['cr'] = np.zeros(len(raman_efficiency['cr']))
# raman solver parameters
z_resolution = sim_params.raman_params.space_resolution
tolerance = sim_params.raman_params.tolerance
logger.debug('Start computing fiber Stimulated Raman Scattering')
power_spectrum, freq_array, prop_direct, _ = self._compute_power_spectrum(carriers, raman_pumps)
alphap_fiber = self.fiber.alpha(freq_array)
freq_diff = abs(freq_array - np.reshape(freq_array, (len(freq_array), 1)))
interp_cr = interp1d(raman_efficiency['frequency_offset'], raman_efficiency['cr'])
cr = interp_cr(freq_diff)
# z propagation axis
z = np.arange(0, fiber_length + 1, z_resolution)
def ode_function(z, p):
return self._ode_stimulated_raman(z, p, alphap_fiber, freq_array, cr, prop_direct)
def boundary_residual(ya, yb):
return self._residuals_stimulated_raman(ya, yb, power_spectrum, prop_direct)
initial_guess_conditions = self._initial_guess_stimulated_raman(z, power_spectrum, alphap_fiber, prop_direct)
# ODE SOLVER
bvp_solution = solve_bvp(ode_function, boundary_residual, z, initial_guess_conditions, tol=tolerance)
rho = (bvp_solution.y.transpose() / power_spectrum).transpose()
rho = np.sqrt(rho) # From power attenuation to field attenuation
stimulated_raman_scattering = StimulatedRamanScattering(freq_array, bvp_solution.x, rho, bvp_solution.y)
self._stimulated_raman_scattering = stimulated_raman_scattering
def _residuals_stimulated_raman(self, ya, yb, power_spectrum, prop_direct):
computed_boundary_value = np.zeros(ya.size)
for index, direction in enumerate(prop_direct):
if direction == +1:
computed_boundary_value[index] = ya[index]
else:
computed_boundary_value[index] = yb[index]
return power_spectrum - computed_boundary_value
def _initial_guess_stimulated_raman(self, z, power_spectrum, alphap_fiber, prop_direct):
""" Computes the initial guess knowing the boundary conditions
:param z: spatial axis [m]. numpy array
:param power_spectrum: power in each frequency slice [W].
Frequency axis is defined by freq_array. numpy array
:param alphap_fiber: frequency dependent fiber attenuation of signal power [1/m].
Frequency defined by freq_array. numpy array
:param prop_direct: indicates the propagation direction of each power slice in power_spectrum:
+1 for forward propagation and -1 for backward propagation. Frequency defined by freq_array. numpy array
:return: power_guess: guess on the initial conditions [W].
The first ndarray index identifies the frequency slice,
the second ndarray index identifies the step in z. ndarray
"""
power_guess = np.empty((power_spectrum.size, z.size))
for f_index, power_slice in enumerate(power_spectrum):
if prop_direct[f_index] == +1:
power_guess[f_index, :] = np.exp(-alphap_fiber[f_index] * z) * power_slice
else:
power_guess[f_index, :] = np.exp(-alphap_fiber[f_index] * z[::-1]) * power_slice
return power_guess
def _ode_stimulated_raman(self, z, power_spectrum, alphap_fiber, freq_array, cr_raman_matrix, prop_direct):
""" Aim of ode_raman is to implement the set of ordinary differential equations (ODEs)
describing the Raman effect.
:param z: spatial axis (unused).
:param power_spectrum: power in each frequency slice [W].
Frequency axis is defined by freq_array. numpy array. Size n
:param alphap_fiber: frequency dependent fiber attenuation of signal power [1/m].
Frequency defined by freq_array. numpy array. Size n
:param freq_array: reference frequency axis [Hz]. numpy array. Size n
:param cr_raman_matrix: Cr(f) Raman gain efficiency variation in frequency [1/W/m].
Frequency defined by freq_array. numpy ndarray. Size nxn
:param prop_direct: indicates the propagation direction of each power slice in power_spectrum:
+1 for forward propagation and -1 for backward propagation.
Frequency defined by freq_array. numpy array. Size n
:return: dP/dz: the power variation in dz [W/m]. numpy array. Size n
"""
dpdz = np.nan * np.ones(power_spectrum.shape)
for f_ind, power in enumerate(power_spectrum):
cr_raman = cr_raman_matrix[f_ind, :]
vibrational_loss = freq_array[f_ind] / freq_array[:f_ind]
for z_ind, power_sample in enumerate(power):
raman_gain = np.sum(cr_raman[f_ind + 1:] * power_spectrum[f_ind + 1:, z_ind])
raman_loss = np.sum(vibrational_loss * cr_raman[:f_ind] * power_spectrum[:f_ind, z_ind])
dpdz_element = prop_direct[f_ind] * (-alphap_fiber[f_ind] + raman_gain - raman_loss) * power_sample
dpdz[f_ind][z_ind] = dpdz_element
return np.vstack(dpdz)
class NliSolver:
""" This class implements the NLI models.
Model and method can be specified in `sim_params.nli_params.nli_method_name`.
List of implemented methods:
'gn_model_analytic': brute force triple integral solution
'ggn_spectrally_separated_xpm_spm': XPM plus SPM
"""
def __init__(self, fiber=None):
""" Initialize the Nli solver object.
:param fiber: instance of elements.py/Fiber.
"""
self._fiber = fiber
self._stimulated_raman_scattering = None
@property
def fiber(self):
return self._fiber
@property
def stimulated_raman_scattering(self):
return self._stimulated_raman_scattering
@stimulated_raman_scattering.setter
def stimulated_raman_scattering(self, stimulated_raman_scattering):
self._stimulated_raman_scattering = stimulated_raman_scattering
def compute_nli(self, carrier, *carriers):
""" Compute NLI power generated by the WDM comb `*carriers` on the channel under test `carrier`
at the end of the fiber span.
"""
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
if 'gn_model_analytic' == sim_params.nli_params.nli_method_name.lower():
carrier_nli = self._gn_analytic(carrier, *carriers)
elif 'ggn_spectrally_separated' in sim_params.nli_params.nli_method_name.lower():
eta_matrix = self._compute_eta_matrix(carrier, *carriers)
carrier_nli = self._carrier_nli_from_eta_matrix(eta_matrix, carrier, *carriers)
else:
raise ValueError(f'Method {sim_params.nli_params.nli_method_name} not implemented.')
return carrier_nli
@staticmethod
def _carrier_nli_from_eta_matrix(eta_matrix, carrier, *carriers):
carrier_nli = 0
for pump_carrier_1 in carriers:
for pump_carrier_2 in carriers:
carrier_nli += eta_matrix[pump_carrier_1.channel_number - 1, pump_carrier_2.channel_number - 1] * \
pump_carrier_1.power.signal * pump_carrier_2.power.signal
carrier_nli *= carrier.power.signal
return carrier_nli
def _compute_eta_matrix(self, carrier_cut, *carriers):
cut_index = carrier_cut.channel_number - 1
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
# Matrix initialization
matrix_size = max(carriers, key=lambda x: getattr(x, 'channel_number')).channel_number
eta_matrix = np.zeros(shape=(matrix_size, matrix_size))
# SPM
logger.debug(f'Start computing SPM on channel #{carrier_cut.channel_number}')
# SPM GGN
if 'ggn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._generalized_spectrally_separated_spm(carrier_cut)
# SPM GN
elif 'gn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._gn_analytic(carrier_cut, *[carrier_cut])
eta_matrix[cut_index, cut_index] = partial_nli / (carrier_cut.power.signal**3)
# XPM
for pump_carrier in carriers:
pump_index = pump_carrier.channel_number - 1
if not (cut_index == pump_index):
logger.debug(f'Start computing XPM on channel #{carrier_cut.channel_number} '
f'from channel #{pump_carrier.channel_number}')
# XPM GGN
if 'ggn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._generalized_spectrally_separated_xpm(carrier_cut, pump_carrier)
# XPM GN
elif 'gn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._gn_analytic(carrier_cut, *[pump_carrier])
eta_matrix[pump_index, pump_index] = partial_nli /\
(carrier_cut.power.signal * pump_carrier.power.signal**2)
return eta_matrix
# Methods for computing GN-model
def _gn_analytic(self, carrier, *carriers):
""" Computes the nonlinear interference power on a single carrier.
The method uses eq. 120 from arXiv:1209.0394.
:param carrier: the signal under analysis
:param carriers: the full WDM comb
:return: carrier_nli: the amount of nonlinear interference in W on the carrier under analysis
"""
beta2 = self.fiber.params.beta2
gamma = self.fiber.params.gamma
effective_length = self.fiber.params.effective_length
asymptotic_length = self.fiber.params.asymptotic_length
g_nli = 0
for interfering_carrier in carriers:
g_interfering = interfering_carrier.power.signal / interfering_carrier.baud_rate
g_signal = carrier.power.signal / carrier.baud_rate
g_nli += g_interfering**2 * g_signal \
* _psi(carrier, interfering_carrier, beta2=beta2, asymptotic_length=asymptotic_length)
g_nli *= (16.0 / 27.0) * (gamma * effective_length) ** 2 /\
(2 * np.pi * abs(beta2) * asymptotic_length)
carrier_nli = carrier.baud_rate * g_nli
return carrier_nli
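# A hedged restatement of _gn_analytic above (eq. 120 of arXiv:1209.0394):
# with G_i = P_i / R_i the flat-top PSD of carrier i and "cut" the carrier
# under test,
#   G_NLI(f_cut) = (16/27) * (gamma * L_eff)**2 / (2 * pi * |beta2| * L_eff_a)
#                  * sum_i G_i**2 * G_cut * psi(cut, i)
# and the returned NLI power is P_NLI = R_cut * G_NLI(f_cut).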
# Methods for computing the GGN-model
def _generalized_spectrally_separated_spm(self, carrier):
gamma = self.fiber.params.gamma
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
f_cut_resolution = sim_params.nli_params.f_cut_resolution['delta_0']
f_eval = carrier.frequency
g_cut = (carrier.power.signal / carrier.baud_rate)
spm_nli = carrier.baud_rate * (16.0 / 27.0) * gamma ** 2 * g_cut ** 3 * \
self._generalized_psi(carrier, carrier, f_eval, f_cut_resolution, f_cut_resolution)
return spm_nli
def _generalized_spectrally_separated_xpm(self, carrier_cut, pump_carrier):
gamma = self.fiber.params.gamma
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
delta_index = pump_carrier.channel_number - carrier_cut.channel_number
f_cut_resolution = sim_params.nli_params.f_cut_resolution[f'delta_{delta_index}']
f_pump_resolution = sim_params.nli_params.f_pump_resolution
f_eval = carrier_cut.frequency
g_pump = (pump_carrier.power.signal / pump_carrier.baud_rate)
g_cut = (carrier_cut.power.signal / carrier_cut.baud_rate)
frequency_offset_threshold = self._frequency_offset_threshold(pump_carrier.baud_rate)
if abs(carrier_cut.frequency - pump_carrier.frequency) <= frequency_offset_threshold:
xpm_nli = carrier_cut.baud_rate * (16.0 / 27.0) * gamma ** 2 * g_pump**2 * g_cut * \
2 * self._generalized_psi(carrier_cut, pump_carrier, f_eval, f_cut_resolution, f_pump_resolution)
else:
xpm_nli = carrier_cut.baud_rate * (16.0 / 27.0) * gamma ** 2 * g_pump**2 * g_cut * \
2 * self._fast_generalized_psi(carrier_cut, pump_carrier, f_eval, f_cut_resolution)
return xpm_nli
def _fast_generalized_psi(self, carrier_cut, pump_carrier, f_eval, f_cut_resolution):
""" It computes the generalized psi function similarly to the one used in the GN model
:return: generalized_psi
"""
# Fiber parameters
alpha0 = self.fiber.alpha0(f_eval)
beta2 = self.fiber.params.beta2
beta3 = self.fiber.params.beta3
f_ref_beta = self.fiber.params.ref_frequency
z = self.stimulated_raman_scattering.z
frequency_rho = self.stimulated_raman_scattering.frequency
rho_norm = self.stimulated_raman_scattering.rho * np.exp(np.abs(alpha0) * z / 2)
if len(frequency_rho) == 1:
def rho_function(f): return rho_norm[0, :]
else:
rho_function = interp1d(frequency_rho, rho_norm, axis=0, fill_value='extrapolate')
rho_norm_pump = rho_function(pump_carrier.frequency)
f1_array = np.array([pump_carrier.frequency - (pump_carrier.baud_rate * (1 + pump_carrier.roll_off) / 2),
pump_carrier.frequency + (pump_carrier.baud_rate * (1 + pump_carrier.roll_off) / 2)])
f2_array = np.arange(carrier_cut.frequency,
carrier_cut.frequency + (carrier_cut.baud_rate * (1 + carrier_cut.roll_off) / 2),
f_cut_resolution) # Only positive f2 is used since integrand_f2 is symmetric
integrand_f1 = np.zeros(len(f1_array))
for f1_index, f1 in enumerate(f1_array):
delta_beta = 4 * np.pi**2 * (f1 - f_eval) * (f2_array - f_eval) * \
(beta2 + np.pi * beta3 * (f1 + f2_array - 2 * f_ref_beta))
integrand_f2 = self._generalized_rho_nli(delta_beta, rho_norm_pump, z, alpha0)
integrand_f1[f1_index] = 2 * np.trapz(integrand_f2, f2_array) # 2x since integrand_f2 is symmetric in f2
generalized_psi = 0.5 * sum(integrand_f1) * pump_carrier.baud_rate
return generalized_psi
def _generalized_psi(self, carrier_cut, pump_carrier, f_eval, f_cut_resolution, f_pump_resolution):
""" It computes the generalized psi function similarly to the one used in the GN model
:return: generalized_psi
"""
# Fiber parameters
alpha0 = self.fiber.alpha0(f_eval)
beta2 = self.fiber.params.beta2
beta3 = self.fiber.params.beta3
f_ref_beta = self.fiber.params.ref_frequency
z = self.stimulated_raman_scattering.z
frequency_rho = self.stimulated_raman_scattering.frequency
rho_norm = self.stimulated_raman_scattering.rho * np.exp(np.abs(alpha0) * z / 2)
if len(frequency_rho) == 1:
def rho_function(f): return rho_norm[0, :]
else:
rho_function = interp1d(frequency_rho, rho_norm, axis=0, fill_value='extrapolate')
rho_norm_pump = rho_function(pump_carrier.frequency)
f1_array = np.arange(pump_carrier.frequency - (pump_carrier.baud_rate * (1 + pump_carrier.roll_off) / 2),
pump_carrier.frequency + (pump_carrier.baud_rate * (1 + pump_carrier.roll_off) / 2),
f_pump_resolution)
f2_array = np.arange(carrier_cut.frequency - (carrier_cut.baud_rate * (1 + carrier_cut.roll_off) / 2),
carrier_cut.frequency + (carrier_cut.baud_rate * (1 + carrier_cut.roll_off) / 2),
f_cut_resolution)
psd1 = raised_cosine_comb(f1_array, pump_carrier) * (pump_carrier.baud_rate / pump_carrier.power.signal)
integrand_f1 = np.zeros(len(f1_array))
for f1_index, (f1, psd1_sample) in enumerate(zip(f1_array, psd1)):
f3_array = f1 + f2_array - f_eval
psd2 = raised_cosine_comb(f2_array, carrier_cut) * (carrier_cut.baud_rate / carrier_cut.power.signal)
psd3 = raised_cosine_comb(f3_array, pump_carrier) * (pump_carrier.baud_rate / pump_carrier.power.signal)
ggg = psd1_sample * psd2 * psd3
delta_beta = 4 * np.pi**2 * (f1 - f_eval) * (f2_array - f_eval) * \
(beta2 + np.pi * beta3 * (f1 + f2_array - 2 * f_ref_beta))
integrand_f2 = ggg * self._generalized_rho_nli(delta_beta, rho_norm_pump, z, alpha0)
integrand_f1[f1_index] = np.trapz(integrand_f2, f2_array)
generalized_psi = np.trapz(integrand_f1, f1_array)
return generalized_psi
@staticmethod
def _generalized_rho_nli(delta_beta, rho_norm_pump, z, alpha0):
w = 1j * delta_beta - alpha0
generalized_rho_nli = (rho_norm_pump[-1]**2 * np.exp(w * z[-1]) - rho_norm_pump[0]**2 * np.exp(w * z[0])) / w
for z_ind in range(0, len(z) - 1):
derivative_rho = (rho_norm_pump[z_ind + 1]**2 - rho_norm_pump[z_ind]**2) / (z[z_ind + 1] - z[z_ind])
generalized_rho_nli -= derivative_rho * (np.exp(w * z[z_ind + 1]) - np.exp(w * z[z_ind])) / (w**2)
generalized_rho_nli = np.abs(generalized_rho_nli)**2
return generalized_rho_nli
def _frequency_offset_threshold(self, symbol_rate):
k_ref = 5
beta2_ref = 21.3e-27
delta_f_ref = 50e9
rs_ref = 32e9
beta2 = abs(self.fiber.params.beta2)
freq_offset_th = ((k_ref * delta_f_ref) * rs_ref * beta2_ref) / (beta2 * symbol_rate)
return freq_offset_th
def _psi(carrier, interfering_carrier, beta2, asymptotic_length):
"""Calculates eq. 123 from `arXiv:1209.0394 <https://arxiv.org/abs/1209.0394>`__"""
if carrier.channel_number == interfering_carrier.channel_number: # SCI, SPM
psi = np.arcsinh(0.5 * np.pi**2 * asymptotic_length * abs(beta2) * carrier.baud_rate**2)
else: # XCI, XPM
delta_f = carrier.frequency - interfering_carrier.frequency
psi = np.arcsinh(np.pi**2 * asymptotic_length * abs(beta2) *
carrier.baud_rate * (delta_f + 0.5 * interfering_carrier.baud_rate))
psi -= np.arcsinh(np.pi**2 * asymptotic_length * abs(beta2) *
carrier.baud_rate * (delta_f - 0.5 * interfering_carrier.baud_rate))
return psi
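# A hedged restatement of _psi above (eq. 123 of arXiv:1209.0394):
#   SCI (same channel): psi = asinh((pi**2 / 2) * L_eff_a * |beta2| * R_cut**2)
#   XCI (offset delta_f): psi = asinh(pi**2 * L_eff_a * |beta2| * R_cut * (delta_f + R_int / 2))
#                             - asinh(pi**2 * L_eff_a * |beta2| * R_cut * (delta_f - R_int / 2))
# with L_eff_a the asymptotic effective length, R_cut / R_int the baud rates of
# the carrier under test and of the interfering carrier, and delta_f their
# frequency separation.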
def estimate_nf_model(type_variety, gain_min, gain_max, nf_min, nf_max):
if nf_min < -10:
raise EquipmentConfigError(f'Invalid nf_min value {nf_min!r} for amplifier {type_variety}')
if nf_max < -10:
raise EquipmentConfigError(f'Invalid nf_max value {nf_max!r} for amplifier {type_variety}')
# NF estimation model based on nf_min and nf_max
# delta_p: max power dB difference between first and second stage coils
# dB g1a: first stage gain - internal VOA attenuation
# nf1, nf2: first and second stage coils
# calculated by solving nf_{min,max} = nf1 + nf2 / g1a{min,max}
delta_p = 5
g1a_min = gain_min - (gain_max - gain_min) - delta_p
g1a_max = gain_max - delta_p
nf2 = lin2db((db2lin(nf_min) - db2lin(nf_max)) /
(1 / db2lin(g1a_max) - 1 / db2lin(g1a_min)))
nf1 = lin2db(db2lin(nf_min) - db2lin(nf2) / db2lin(g1a_max))
if nf1 < 4:
raise EquipmentConfigError(f'First coil value too low {nf1} for amplifier {type_variety}')
# Check that the computed delta_p keeps nf_min and nf_max values sensible.
# There shouldn't be a large NF difference between the two coils:
# nf1 + 0.3 < nf2 < nf1 + 2 is expected.
# If not, clip nf2, recompute delta_p and verify it stays within a plausible range.
if not nf1 + 0.3 < nf2 < nf1 + 2:
nf2 = np.clip(nf2, nf1 + 0.3, nf1 + 2)
g1a_max = lin2db(db2lin(nf2) / (db2lin(nf_min) - db2lin(nf1)))
delta_p = gain_max - g1a_max
g1a_min = gain_min - (gain_max - gain_min) - delta_p
if not 1 < delta_p < 11:
raise EquipmentConfigError(f'Computed \N{greek capital letter delta}P invalid \
\n 1st coil vs 2nd coil calculated DeltaP {delta_p:.2f} for \
\n amplifier {type_variety} is not valid: revise inputs \
\n calculated 1st coil NF = {nf1:.2f}, 2nd coil NF = {nf2:.2f}')
# Check calculated values for nf1 and nf2
calc_nf_min = lin2db(db2lin(nf1) + db2lin(nf2) / db2lin(g1a_max))
if not isclose(nf_min, calc_nf_min, abs_tol=0.01):
raise EquipmentConfigError(f'nf_min does not match calc_nf_min, {nf_min} vs {calc_nf_min} for amp {type_variety}')
calc_nf_max = lin2db(db2lin(nf1) + db2lin(nf2) / db2lin(g1a_min))
if not isclose(nf_max, calc_nf_max, abs_tol=0.01):
raise EquipmentConfigError(f'nf_max does not match calc_nf_max, {nf_max} vs {calc_nf_max} for amp {type_variety}')
return nf1, nf2, delta_p
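# Illustrative example (hedged, approximate values): for a hypothetical
# variable-gain amplifier with gain_min=15 dB, gain_max=26 dB, nf_min=6 dB and
# nf_max=10 dB, the model above gives delta_p = 5 dB, g1a_max = 21 dB and
# g1a_min = -1 dB; solving the two equations then yields roughly nf2 ~ 6.8 dB
# and nf1 ~ 6.0 dB, and both consistency checks pass without clipping.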

View File

@@ -1,2 +0,0 @@
UNITS = {'m': 1,
'km': 1E3}

View File

@@ -1,71 +1,141 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import json
'''
gnpy.core.utils
===============
This module contains utility functions that are used with gnpy.
'''
from csv import writer
import numpy as np
from numpy import pi, cos, sqrt, log10
from scipy import constants
from gnpy.core.exceptions import ConfigurationError
def load_json(filename):
with open(filename, 'r') as f:
data = json.load(f)
return data
def save_json(obj, filename):
with open(filename, 'w') as f:
json.dump(obj, f)
def c():
def write_csv(obj, filename):
"""
Returns the speed of light in meters per second
Convert dictionary items to a CSV file the dictionary format:
::
{'result category 1':
[
# 1st line of results
{'header 1' : value_xxx,
'header 2' : value_yyy},
# 2nd line of results: same headers, different results
{'header 1' : value_www,
'header 2' : value_zzz}
],
'result_category 2':
[
{},{}
]
}
The generated csv file will be:
::
result_category 1
header 1 header 2
value_xxx value_yyy
value_www value_zzz
result_category 2
...
"""
return 299792458.0
with open(filename, 'w', encoding='utf-8') as f:
w = writer(f)
for data_key, data_list in obj.items():
# main header
w.writerow([data_key])
# sub headers:
headers = [_ for _ in data_list[0].keys()]
w.writerow(headers)
for data_dict in data_list:
w.writerow([_ for _ in data_dict.values()])
def itufs(spacing, startf=191.35, stopf=196.10):
"""Creates an array of frequencies whose default range is
191.35-196.10 THz
def arrange_frequencies(length, start, stop):
"""Create an array of frequencies
:param spacing: Frequency spacing in THz
:param startf: Start frequency in THz
:param stopf: Stop frequency in THz
:type spacing: float
:type startf: float
:type stopf: float
:return: an array of frequencies determined by the spacing parameter
:param length: number of elements
:param start: Start frequency in THz
:param stop: Stop frequency in THz
:type length: integer
:type start: float
:type stop: float
:return: an array of `length` evenly spaced frequencies between `start` and `stop`
:rtype: numpy.ndarray
"""
return np.arange(startf, stopf + spacing / 2, spacing)
def h():
"""
Returns Planck's constant in J*s
"""
return 6.62607004e-34
return np.linspace(start, stop, length)
def lin2db(value):
"""Convert linear unit to logarithmic (dB)
>>> lin2db(0.001)
-30.0
>>> round(lin2db(1.0), 2)
0.0
>>> round(lin2db(1.26), 2)
1.0
>>> round(lin2db(10.0), 2)
10.0
>>> round(lin2db(100.0), 2)
20.0
"""
return 10 * log10(value)
def db2lin(value):
Convert logarithmic units to linear
>>> round(db2lin(10.0), 2)
10.0
>>> round(db2lin(20.0), 2)
100.0
>>> round(db2lin(1.0), 2)
1.26
>>> round(db2lin(0.0), 2)
1.0
>>> round(db2lin(-10.0), 2)
0.1
"""
return 10**(value / 10)
def wavelength2freq(value):
""" Converts wavelength units to frequeuncy units.
"""
return c() / value
def round2float(number, step):
step = round(step, 1)
if step >= 0.01:
number = round(number / step, 0)
number = round(number * step, 1)
else:
number = round(number, 2)
return number
wavelength2freq = constants.lambda2nu
freq2wavelength = constants.nu2lambda
def freq2wavelength(value):
""" Converts frequency units to wavelength units.
>>> round(freq2wavelength(191.35e12) * 1e9, 3)
1566.723
>>> round(freq2wavelength(196.1e12) * 1e9, 3)
1528.773
"""
return c() / value
return constants.c / value
def snr_sum(snr, bw, snr_added, bw_added=12.5e9):
snr_added = snr_added - lin2db(bw / bw_added)
snr = -lin2db(db2lin(-snr) + db2lin(-snr_added))
return snr
def deltawl2deltaf(delta_wl, wavelength):
@@ -128,3 +198,101 @@ def rrc(ffs, baud_rate, alpha):
p_inds = np.where(np.logical_and(np.abs(ffs) > 0, np.abs(ffs) < l_lim))
hf[p_inds] = 1
return sqrt(hf)
def merge_amplifier_restrictions(dict1, dict2):
"""Updates contents of dicts recursively
>>> d1 = {'params': {'restrictions': {'preamp_variety_list': [], 'booster_variety_list': []}}}
>>> d2 = {'params': {'target_pch_out_db': -20}}
>>> merge_amplifier_restrictions(d1, d2)
{'params': {'restrictions': {'preamp_variety_list': [], 'booster_variety_list': []}, 'target_pch_out_db': -20}}
>>> d3 = {'params': {'restrictions': {'preamp_variety_list': ['foo'], 'booster_variety_list': ['bar']}}}
>>> merge_amplifier_restrictions(d1, d3)
{'params': {'restrictions': {'preamp_variety_list': [], 'booster_variety_list': []}}}
"""
copy_dict1 = dict1.copy()
for key in dict2:
if key in dict1:
if isinstance(dict1[key], dict):
copy_dict1[key] = merge_amplifier_restrictions(copy_dict1[key], dict2[key])
else:
copy_dict1[key] = dict2[key]
return copy_dict1
def silent_remove(this_list, elem):
"""Remove matching elements from a list without raising ValueError
>>> li = [0, 1]
>>> li = silent_remove(li, 1)
>>> li
[0]
>>> li = silent_remove(li, 1)
>>> li
[0]
"""
try:
this_list.remove(elem)
except ValueError:
pass
return this_list
def automatic_nch(f_min, f_max, spacing):
"""How many channels are available in the spectrum
:param f_min Lowest frequency [Hz]
:param f_max Highest frequency [Hz]
:param spacing Channel width [Hz]
:return Number of uniform channels
>>> automatic_nch(191.325e12, 196.125e12, 50e9)
96
>>> automatic_nch(193.475e12, 193.525e12, 50e9)
1
"""
return int((f_max - f_min) // spacing)
def automatic_fmax(f_min, spacing, nch):
"""Find the high-frequenecy boundary of a spectrum
:param f_min Start of the spectrum (lowest frequency edge) [Hz]
:param spacing Grid/channel spacing [Hz]
:param nch Number of channels
:return End of the spectrum (highest frequency) [Hz]
>>> automatic_fmax(191.325e12, 50e9, 96)
196125000000000.0
"""
return f_min + spacing * nch
def convert_length(value, units):
"""Convert length into basic SI units
>>> convert_length(1, 'km')
1000.0
>>> convert_length(2.0, 'km')
2000.0
>>> convert_length(123, 'm')
123.0
>>> convert_length(123.0, 'm')
123.0
>>> convert_length(42.1, 'km')
42100.0
>>> convert_length(666, 'yards')
Traceback (most recent call last):
...
gnpy.core.exceptions.ConfigurationError: Cannot convert length in "yards" into meters
"""
if units == 'm':
return value * 1e0
elif units == 'km':
return value * 1e3
else:
raise ConfigurationError(f'Cannot convert length in "{units}" into meters')

File diff suppressed because it is too large

Binary file not shown.

View File

@@ -0,0 +1,160 @@
{
"nf_fit_coeff": [
0.0008,
0.0272,
-0.2249,
6.4902
],
"f_min": 191.35e12,
"f_max": 196.1e12,
"nf_ripple": [
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0
],
"gain_ripple": [
0.15017064489112,
0.14157768006701,
0.00223094639866,
-0.06701528475711,
-0.05982935510889,
-0.01028161641541,
0.02740682579566,
0.02795958961474,
0.00107516750419,
-0.02199015912898,
-0.00877407872698,
0.0453465242881,
0.1204721524288,
0.18936662479061,
0.23826109715241,
0.26956762981574,
0.27836159966498,
0.26941687604691,
0.23579878559464,
0.18147717755444,
0.1191656197655,
0.05921587102177,
0.01509526800668,
-0.01053287269681,
-0.02475397822447,
-0.01847257118928,
-0.00420121440538,
0.01584903685091,
0.0399193886097,
0.04494451423784,
0.04961788107202,
0.03378873534338,
0.01027114740367,
-0.01319618927973,
-0.04962835008375,
-0.0765630234506,
-0.10606051088777,
-0.13550774706866,
-0.15460322445561,
-0.17113588777219,
-0.18053287269681,
-0.18324644053602,
-0.19440221943049,
-0.20897508375209,
-0.23575900335007,
-0.25188965661642,
-0.22244242043552,
-0.15656302345061
],
"dgt": [
2.4553191172498,
2.44342862248888,
2.41879254989742,
2.38192717604575,
2.33147727493671,
2.26678136721453,
2.19013043016015,
2.10336369905543,
2.01414465424155,
1.92915262384742,
1.85543800978691,
1.79748596476494,
1.75428006928365,
1.72461030013125,
1.70379790088896,
1.68845480656382,
1.6761448370895,
1.66286684904577,
1.64799163036252,
1.63068023161292,
1.61073904908309,
1.58973304612691,
1.56750088631614,
1.54578500307573,
1.5242627235492,
1.50335352244996,
1.48420288841848,
1.46637521309853,
1.44977369463316,
1.43476940680732,
1.42089447397912,
1.40864903907609,
1.3966294751726,
1.38430337205545,
1.3710092503689,
1.35690844654118,
1.3405812000038,
1.32210817897091,
1.30069883494415,
1.27657903892303,
1.24931318255134,
1.21911100318577,
1.18632744096844,
1.15209185089701,
1.11575888725852,
1.07773189112355,
1.03941448941778,
1.0
]
}

View File

@@ -0,0 +1,104 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
create_eqpt_sheet.py
====================
XLS parser that can be called to create a "City" column in the "Eqpt" sheet.
If not present in the "Nodes" sheet, the "Type" column will be implicitly
determined based on the topology.
"""
from xlrd import open_workbook
from argparse import ArgumentParser
PARSER = ArgumentParser()
PARSER.add_argument('workbook', nargs='?', default='meshTopologyExampleV2.xls',
help='create the mandatory columns in Eqpt sheet')
def ALL_ROWS(sh, start=0):
return (sh.row(x) for x in range(start, sh.nrows))
class Node:
""" Node element contains uid, list of connected nodes and eqpt type
"""
def __init__(self, uid, to_node):
self.uid = uid
self.to_node = to_node
self.eqpt = None
def __repr__(self):
return f'uid {self.uid} \nto_node {[node for node in self.to_node]}\neqpt {self.eqpt}\n'
def __str__(self):
return f'uid {self.uid} \nto_node {[node for node in self.to_node]}\neqpt {self.eqpt}\n'
def read_excel(input_filename):
""" read excel Nodes and Links sheets and create a dict of nodes with
their to_nodes and type of eqpt
"""
with open_workbook(input_filename) as wobo:
# reading Links sheet
links_sheet = wobo.sheet_by_name('Links')
nodes = {}
for row in ALL_ROWS(links_sheet, start=5):
try:
nodes[row[0].value].to_node.append(row[1].value)
except KeyError:
nodes[row[0].value] = Node(row[0].value, [row[1].value])
try:
nodes[row[1].value].to_node.append(row[0].value)
except KeyError:
nodes[row[1].value] = Node(row[1].value, [row[0].value])
nodes_sheet = wobo.sheet_by_name('Nodes')
for row in ALL_ROWS(nodes_sheet, start=5):
node = row[0].value
eqpt = row[6].value
try:
if eqpt == 'ILA' and len(nodes[node].to_node) != 2:
print(f'Inconsistency: ILA node with degree != 2: {node}')
exit()
if eqpt == '' and len(nodes[node].to_node) == 2:
nodes[node].eqpt = 'ILA'
elif eqpt == '' and len(nodes[node].to_node) != 2:
nodes[node].eqpt = 'ROADM'
else:
nodes[node].eqpt = eqpt
except KeyError:
print(f'Inconsistency between Nodes and Links sheets: {node} is not listed in Links')
exit()
return nodes
def create_eqt_template(nodes, input_filename):
""" writes list of node A node Z corresponding to Nodes and Links sheets in order
to help user populating Eqpt
"""
output_filename = f'{input_filename[:-4]}_eqpt_sheet.txt'
with open(output_filename, 'w', encoding='utf-8') as my_file:
# print header similar to excel
my_file.write('OPTIONAL\n\n\n\
\t\tNode a egress amp (from a to z)\t\t\t\t\tNode a ingress amp (from z to a) \
\nNode A \tNode Z \tamp type \tatt_in \tamp gain \ttilt \tatt_out\
amp type \tatt_in \tamp gain \ttilt \tatt_out\n')
for node in nodes.values():
if node.eqpt == 'ILA':
my_file.write(f'{node.uid}\t{node.to_node[0]}\n')
if node.eqpt == 'ROADM':
for to_node in node.to_node:
my_file.write(f'{node.uid}\t{to_node}\n')
print(f'File {output_filename} successfully created with Node A - Node Z entries for Eqpt sheet in excel file.')
if __name__ == '__main__':
ARGS = PARSER.parse_args()
create_eqt_template(read_excel(ARGS.workbook), ARGS.workbook)

View File

@@ -0,0 +1,106 @@
{
"nf_ripple": [
0.0
],
"gain_ripple": [
0.0
],
"dgt": [
2.714526681131686,
2.705443819238505,
2.6947834587664494,
2.6841217449620203,
2.6681935771243177,
2.6521732021128046,
2.630396440815385,
2.602860350286428,
2.5696460593920065,
2.5364027376452056,
2.499446286796604,
2.4587748041127506,
2.414398437185221,
2.3699990328716107,
2.322373696229342,
2.271520771371253,
2.2174389328192197,
2.16337565384239,
2.1183028432496016,
2.082225099873648,
2.055100772005235,
2.0279625371819305,
2.0008103857988204,
1.9736443063300082,
1.9482128147680253,
1.9245345552113182,
1.9026104247588487,
1.8806927939516411,
1.862235672444246,
1.847275503201129,
1.835814081380705,
1.824381436842932,
1.8139629377087627,
1.8045606557581335,
1.7961751115773796,
1.7877868031023945,
1.7793941781790852,
1.7709972329654864,
1.7625959636196327,
1.7541903672600494,
1.7459181197626403,
1.737780757913635,
1.7297783508684146,
1.7217732861435076,
1.7137640932265894,
1.7057507692361864,
1.6918150918099673,
1.6719047669939942,
1.6460167077689267,
1.6201194134191075,
1.5986915141218316,
1.5817353179379183,
1.569199764184379,
1.5566577309558969,
1.545374152761467,
1.5353620432989845,
1.5266220576235803,
1.5178910621476225,
1.5097346239790443,
1.502153039909686,
1.495145456062699,
1.488134243479226,
1.48111939735681,
1.474100442252211,
1.4670307626366115,
1.4599103316162523,
1.45273959485914,
1.445565137158368,
1.4340878115214444,
1.418273806730323,
1.3981208704326855,
1.3779439775587023,
1.3598972673004606,
1.3439818461440451,
1.3301807335621048,
1.316383926863083,
1.3040618749785347,
1.2932153453410835,
1.2838336236692311,
1.2744470198196236,
1.2650555289898042,
1.2556591482982988,
1.2428104897182262,
1.2264996957264114,
1.2067249615595257,
1.1869318618366975,
1.1672278304018044,
1.1476135933863398,
1.1280891949729075,
1.108555289615659,
1.0895983485572227,
1.0712204022764056,
1.0534217504465226,
1.0356155337864215,
1.017807767853702,
1.0
]
}

View File

@@ -1,7 +1,7 @@
{
"network_name": "EDFA Example Network - P2P",
"elements": [{
"uid": "Site A",
"uid": "Site_A",
"type": "Transceiver",
"metadata": {
"location": {
@@ -15,13 +15,15 @@
{
"uid": "Span1",
"type": "Fiber",
"type_variety": "SSMF",
"params": {
"length": 80,
"loss_coef": 0.2,
"length_units": "km",
"dispersion": 16.7E-6,
"gamma": 1.27E-3
},
"att_in": 0,
"con_in": 0.5,
"con_out": 0.5
},
"metadata": {
"location": {
"region": "",
@@ -33,11 +35,12 @@
{
"uid": "Edfa1",
"type": "Edfa",
"type_variety": "std_low_gain",
"operational": {
"gain_target": 16,
"tilt_target": 0
"gain_target": 17,
"tilt_target": 0,
"out_voa": 0
},
"config_from_json": "edfa_config.json",
"metadata": {
"location": {
"region": "",
@@ -47,13 +50,13 @@
}
},
{
"uid": "Site B",
"uid": "Site_B",
"type": "Transceiver",
"metadata": {
"location": {
"city": "Site B",
"region": "",
"latitude": 3,
"latitude": 2,
"longitude": 0
}
}
@@ -61,7 +64,7 @@
],
"connections": [{
"from_node": "Site A",
"from_node": "Site_A",
"to_node": "Span1"
},
{
@@ -70,7 +73,7 @@
},
{
"from_node": "Edfa1",
"to_node": "Site B"
"to_node": "Site_B"
}
]

View File

@@ -0,0 +1,6 @@
{
"nf_ripple": "NFR_96.txt",
"gain_ripple": "DFG_96.txt",
"dgt": "DGT_96.txt",
"nf_fit_coeff": "pNFfit3.txt"
}

View File

@@ -0,0 +1,300 @@
*********************************************
Amplifier models and configuration
*********************************************
1. Equipment configuration description
#######################################
Equipment description defines equipment types and parameters.
It takes place in the default **eqpt_config.json** file.
By default **gnpy-transmission-example** uses the **eqpt_config.json** file; this
can be changed with the **-e** or **--equipment** command line parameter.
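The same equipment library can also be loaded programmatically. Below is a
minimal sketch (it only reuses the ``load_equipment`` helper from
``gnpy.tools.json_io`` that the bundled example scripts use; the file name is
an example):

.. code-block:: python

    from gnpy.tools.json_io import load_equipment

    # load the default equipment library; any other JSON file can be passed
    # instead, mirroring the -e/--equipment command line parameter
    equipment = load_equipment('eqpt_config.json')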
2. Amplifier parameters and subtypes
#######################################
Several amplifiers can be used by GNpy, so they are defined as an array of equipment parameters in the **eqpt_config.json** file.
- *"type_variety"*:
Each amplifier is identified by its unique *"type_variety"*, which is used in the input topology files to reference a specific amplifier. It is a free-form, user-defined identifier.
For each amplifier *type_variety*, specific parameters describe its attributes and performance:
- *"type_def"*:
Sets the amplifier model that the simulation will use to calculate the ASE noise contribution. Five models are defined with reserved words:
- *"advanced_model"*
- *"variable_gain"*
- *"fixed_gain"*
- *"dual_stage"*
- *"openroadm"*
*see next section for a full description of these models*
- *"advanced_config_from_json"*:
**This parameter is only applicable to the *"advanced_model"* model**
json file name describing:
- nf_fit_coeff
- f_min/max
- gain_ripple
- nf_ripple
- dgt
*see next section for a full description*
- *"gain_flatmax"*:
amplifier maximum gain in dB before its extended gain range: flat or nominal tilt output.
If gain > gain_flatmax, the amplifier will tilt, based on its dgt function
If gain > gain_flatmax + target_extended_gain, the amplifier output power is reduced to not exceed the extended gain range.
- *"gain_min"*:
amplifier minimum gain in dB.
If gain < gain_min, the amplifier input is automatically padded, which results in
NF += gain_min - gain
- *"p_max"*:
amplifier maximum output power at full load.
The total signal output power will not be allowed to exceed this value.
- *"nf_min/max"*:
**These parameters are only applicable to the *"variable_gain"* model**
min & max NF values in dB
NF_min is the amplifier NF @ gain_max
NF_max is the amplifier NF @ gain_min
- *"nf_coef"*:
**This parameter is only applicable to the *"openroadm"* model**
[a, b, c, d] 3rd order polynomial coefficients list to define the incremental OSNR vs Pin
Incremental OSNR is the amplifier OSNR contribution
Pin is the amplifier channel input power defined in a 50GHz bandwidth
Incremental OSNR = a*Pin³ + b*Pin² + c*Pin + d
- *"preamp_variety"*:
**This parameter is only applicable to the *"dual_stage"* model**
1st stage type_variety
- *"booster_variety"*:
**This parameter is only applicable to the *"dual_stage"* model**
2nd stage type_variety
- *"out_voa_auto"*: true/false
**power_mode only**
**This parameter is only applicable to the *"advanced_model"* and *"variable_gain"* models**
If "out_voa_auto": true, auto_design will chose the output_VOA value that maximizes the amplifier gain within its power capability and therefore minimizes its NF.
- *"allowed_for_design"*: true/false
**auto_design only**
Tells auto_design whether this amplifier can be picked for the design (deactivates unwanted amplifiers); a short sketch that reads this flag follows the example below.
It does not prevent the use of an amplifier if it is placed in the topology input.
.. code-block:: json
{"Edfa": [{
"type_variety": "std_medium_gain",
"type_def": "variable_gain",
"gain_flatmax": 26,
"gain_min": 15,
"p_max": 23,
"nf_min": 6,
"nf_max": 10,
"out_voa_auto": false,
"allowed_for_design": true
},
{
"type_variety": "std_low_gain",
"type_def": "variable_gain",
"gain_flatmax": 16,
"gain_min": 8,
"p_max": 23,
"nf_min": 6.5,
"nf_max": 11,
"out_voa_auto": false,
"allowed_for_design": true
}
]}
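As a quick sanity check of such a library, the raw JSON can be inspected
directly. A minimal sketch, assuming only the structure shown above (an
"Edfa" array whose entries carry *"type_variety"*, *"type_def"* and
*"allowed_for_design"*):

.. code-block:: python

    import json

    with open('eqpt_config.json') as f:
        eqpt = json.load(f)

    # list the amplifier varieties that auto_design is allowed to pick
    for amp in eqpt['Edfa']:
        if amp.get('allowed_for_design'):
            print(amp['type_variety'], amp['type_def'])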
3. Amplifier models
#######################################
In an open-source and multi-vendor environment, different use cases and contexts need to be supported, so several amplifier models are available.
Five types of EDFA definition are possible, referenced by the *"type_def"* parameter with the following reserved words:
- *"advanced_model"*
This model is referred to as a white-box model because of the detailed level of knowledge that is required. The amplifier NF model and ripple definition are described by a JSON file referenced with *"advanced_config_from_json"*: json filename. This JSON file contains:
- nf_fit_coeff: [a,b,c,d]
3rd order polynomial NF = f(-dg) coefficients list
dg = gain - gain_max
- f_min/max: amplifier frequency range in Hz
- gain_ripple : [...]
amplifier gain ripple excursion comb list in dB across the frequency range.
- nf_ripple : [...]
amplifier nf ripple excursion comb list in dB across the frequency range.
- dgt : [...]
amplifier dynamic gain tilt comb list across the frequency range.
*See next section for the generation of this json file*
.. code-block:: json-object
"Edfa":[{
"type_variety": "high_detail_model_example",
"type_def": "advanced_model",
"gain_flatmax": 25,
"gain_min": 15,
"p_max": 21,
"advanced_config_from_json": "std_medium_gain_advanced_config.json",
"out_voa_auto": false,
"allowed_for_design": false
}
]
- *"variable_gain"*
This model is referred to as an operator model because a lower level of knowledge is required. A full polynomial description of the NF across the gain range is not needed. Instead, NF_min and NF_max values are required and used by the code to model a dual-stage amplifier with an internal mid-stage VOA. NF_min and NF_max values are typically available from equipment suppliers' data sheets.
There is a default JSON file "default_edfa_config.json" that enforces 0 tilt and ripple values, because the GNpy core algorithm performs multi-carrier propagation.
- gain_ripple =[0,...,0]
- nf_ripple = [0,...,0]
- dgt = [...] generic dgt comb
.. code-block:: json-object
"Edfa":[{
"type_variety": "std_medium_gain",
"type_def": "variable_gain",
"gain_flatmax": 26,
"gain_min": 15,
"p_max": 23,
"nf_min": 6,
"nf_max": 10,
"out_voa_auto": false,
"allowed_for_design": true
}
]
- *"fixed_gain"*
This model is also an operator model, with a single NF value, emulating basic single-coil amplifiers without an internal VOA (a short Python sketch of this NF rule is given at the end of this section).
if gain_min < gain < gain_max, NF == nf0
if gain < gain_min, the amplifier input is automatically padded, which results in
NF += gain_min - gain
.. code-block:: json-object
"Edfa":[{
"type_variety": "std_fixed_gain",
"type_def": "fixed_gain",
"gain_flatmax": 21,
"gain_min": 20,
"p_max": 21,
"nf0": 5.5,
"allowed_for_design": false
}
]
- *"openroadm"*
This model is a black-box model replicating the OpenROADM MSA specification for ILAs.
.. code-block:: json-object
"Edfa":[{
"type_variety": "low_noise",
"type_def": "openroadm",
"gain_flatmax": 27,
"gain_min": 12,
"p_max": 22,
"nf_coef": [-8.104e-4,-6.221e-2,-5.889e-1,37.62],
"allowed_for_design": false
}
]
- *"dual_stage"*
This model allows the cascade (pre-defined combination) of any 2 amplifiers already described in the eqpt_config.json library.
- preamp_variety defines the 1st stage type_variety
- booster_variety defines the 2nd stage type_variety
Both the preamp and booster varieties must exist in the eqpt library.
The resulting NF is the sum of the 2 amplifiers.
The preamp is operated at its maximum gain.
- gain_min indicates to auto_design when this dual_stage should be used
Unlike the other models, the 1st stage input is not padded: it is always operated at its maximum gain and minimum NF. Therefore, if gain adaptation and padding are needed, they are performed by the 2nd stage.
.. code-block:: json
{
"type_variety": "medium+low_gain",
"type_def": "dual_stage",
"gain_min": 25,
"preamp_variety": "std_medium_gain",
"booster_variety": "std_low_gain",
"allowed_for_design": true
}
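Two of the rules quoted above can be sketched in a few lines of Python. This
is a hedged illustration only, not the project's implementation, and the
function names are made up for this example:

.. code-block:: python

    import numpy as np

    def fixed_gain_nf(gain, gain_min, nf0):
        """fixed_gain rule: below gain_min the input is padded and the
        padding adds directly to the noise figure (all values in dB)."""
        return nf0 if gain >= gain_min else nf0 + (gain_min - gain)

    def openroadm_incremental_osnr(pin_dbm, nf_coef):
        """openroadm rule: incremental OSNR = a*Pin**3 + b*Pin**2 + c*Pin + d,
        with Pin the per-channel input power in a 50 GHz bandwidth (dBm)."""
        return np.polyval(nf_coef, pin_dbm)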
4. advanced_config_from_json
#######################################
The build_oa_json.py script in ``gnpy/example-data/edfa_model/`` can be used to build the JSON file required for the amplifier *"advanced_model"* type_def.
It updates an existing JSON file with all the 96-channel txt files for a given amplifier type.
The amplifier type 'OA_type1' is hard coded, but it can be modified and other types added.
It returns an updated amplifier JSON file: output_json_file_name = 'edfa_config.json'.
The script converts a set of amplifier files plus an input JSON definition file into a valid edfa_json_file:
- nf_fit_coeff: NF 3rd order polynomial coefficients txt file,
  nf = f(dg) with dg = gain_operational - gain_max
- nf_ripple: NF ripple excursion txt file
- gain_ripple: gain ripple txt file
- dgt: dynamic gain tilt txt file
The input JSON file is passed as an argument (default = 'OA.json').
The JSON input file should have the following fields:
.. code-block:: json
{
"nf_fit_coeff": "nf_filename.txt",
"nf_ripple": "nf_ripple_filename.txt",
"gain_ripple": "DFG_filename.txt",
"dgt": "DGT_filename.txt"
}

View File

@@ -0,0 +1,89 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Jan 30 12:32:00 2018
@author: jeanluc-auge
update an existing json file with all the 96ch txt files for a given amplifier type
amplifier type 'OA_type1' is hard coded but can be modified and other types added
returns an updated amplifier json file: output_json_file_name = 'edfa_config.json'
"""
import re
import sys
import json
import numpy as np
"""amplifier file names
convert a set of amplifier files + input json definition file into a valid edfa_json_file:
nf_fit_coeff: NF 3rd order polynomial coefficients txt file
nf = f(dg)
with dg = gain_operational - gain_max
nf_ripple: NF ripple excursion txt file
gain_ripple: gain ripple txt file
dgt: dynamic gain txt file
input json file in argument (default = 'OA.json')
the json input file should have the following fields:
{
"nf_fit_coeff": "nf_filename.txt",
"nf_ripple": "nf_ripple_filename.txt",
"gain_ripple": "DFG_filename.txt",
"dgt": "DGT_filename.txt",
}
"""
input_json_file_name = "OA.json" # default path
output_json_file_name = "default_edfa_config.json"
gain_ripple_field = "gain_ripple"
nf_ripple_field = "nf_ripple"
nf_fit_coeff = "nf_fit_coeff"
def read_file(field, file_name):
"""read and format the 96 channels txt files describing the amplifier NF and ripple
convert dfg into gain ripple by removing the mean component
"""
# with open(path + file_name,'r') as this_file:
# data = this_file.read()
# data.strip()
#data = re.sub(r"([0-9])([ ]{1,3})([0-9-+])",r"\1,\3",data)
#data = list(data.split(","))
#data = [float(x) for x in data]
data = np.loadtxt(file_name)
print(len(data), file_name)
if field == gain_ripple_field or field == nf_ripple_field:
# consider ripple excursion only to avoid redundant information
# because the max flat_gain is already given by the 'gain_flat' field in json
# remove the mean component
print(file_name, ', mean value =', data.mean(), ' is subtracted')
data = data - data.mean()
data = data.tolist()
return data
def input_json(path):
"""read the json input file and add all the 96 channels txt files
create the output json file with output_json_file_name"""
with open(path, 'r') as edfa_json_file:
amp_text = edfa_json_file.read()
amp_dict = json.loads(amp_text)
for k, v in amp_dict.items():
if re.search(r'.txt$', str(v)):
amp_dict[k] = read_file(k, v)
amp_text = json.dumps(amp_dict, indent=4)
# print(amp_text)
with open(output_json_file_name, 'w') as edfa_json_file:
edfa_json_file.write(amp_text)
if __name__ == '__main__':
if len(sys.argv) == 2:
path = sys.argv[1]
else:
path = input_json_file_name
input_json(path)

View File

@@ -0,0 +1,312 @@
{ "Edfa":[{
"type_variety": "high_detail_model_example",
"type_def": "advanced_model",
"gain_flatmax": 25,
"gain_min": 15,
"p_max": 21,
"advanced_config_from_json": "std_medium_gain_advanced_config.json",
"out_voa_auto": false,
"allowed_for_design": false
}, {
"type_variety": "Juniper_BoosterHG",
"type_def": "advanced_model",
"gain_flatmax": 25,
"gain_min": 10,
"p_max": 21,
"advanced_config_from_json": "Juniper-BoosterHG.json",
"out_voa_auto": false,
"allowed_for_design": false
},
{
"type_variety": "operator_model_example",
"type_def": "variable_gain",
"gain_flatmax": 26,
"gain_min": 15,
"p_max": 23,
"nf_min": 6,
"nf_max": 10,
"out_voa_auto": false,
"allowed_for_design": false
},
{
"type_variety": "low_noise",
"type_def": "openroadm",
"gain_flatmax": 27,
"gain_min": 12,
"p_max": 22,
"nf_coef": [-8.104e-4,-6.221e-2,-5.889e-1,37.62],
"allowed_for_design": false
},
{
"type_variety": "standard",
"type_def": "openroadm",
"gain_flatmax": 27,
"gain_min": 12,
"p_max": 22,
"nf_coef": [-5.952e-4,-6.250e-2,-1.071,28.99],
"allowed_for_design": false
},
{
"type_variety": "std_high_gain",
"type_def": "variable_gain",
"gain_flatmax": 35,
"gain_min": 25,
"p_max": 21,
"nf_min": 5.5,
"nf_max": 7,
"out_voa_auto": false,
"allowed_for_design": true
},
{
"type_variety": "std_medium_gain",
"type_def": "variable_gain",
"gain_flatmax": 26,
"gain_min": 15,
"p_max": 23,
"nf_min": 6,
"nf_max": 10,
"out_voa_auto": false,
"allowed_for_design": true
},
{
"type_variety": "std_low_gain",
"type_def": "variable_gain",
"gain_flatmax": 16,
"gain_min": 8,
"p_max": 23,
"nf_min": 6.5,
"nf_max": 11,
"out_voa_auto": false,
"allowed_for_design": true
},
{
"type_variety": "high_power",
"type_def": "variable_gain",
"gain_flatmax": 16,
"gain_min": 8,
"p_max": 25,
"nf_min": 9,
"nf_max": 15,
"out_voa_auto": false,
"allowed_for_design": false
},
{
"type_variety": "std_fixed_gain",
"type_def": "fixed_gain",
"gain_flatmax": 21,
"gain_min": 20,
"p_max": 21,
"nf0": 5.5,
"allowed_for_design": false
},
{
"type_variety": "4pumps_raman",
"type_def": "fixed_gain",
"gain_flatmax": 12,
"gain_min": 12,
"p_max": 21,
"nf0": -1,
"allowed_for_design": false
},
{
"type_variety": "hybrid_4pumps_lowgain",
"type_def": "dual_stage",
"raman": true,
"gain_min": 25,
"preamp_variety": "4pumps_raman",
"booster_variety": "std_low_gain",
"allowed_for_design": true
},
{
"type_variety": "hybrid_4pumps_mediumgain",
"type_def": "dual_stage",
"raman": true,
"gain_min": 25,
"preamp_variety": "4pumps_raman",
"booster_variety": "std_medium_gain",
"allowed_for_design": true
},
{
"type_variety": "medium+low_gain",
"type_def": "dual_stage",
"gain_min": 25,
"preamp_variety": "std_medium_gain",
"booster_variety": "std_low_gain",
"allowed_for_design": true
},
{
"type_variety": "medium+high_power",
"type_def": "dual_stage",
"gain_min": 25,
"preamp_variety": "std_medium_gain",
"booster_variety": "high_power",
"allowed_for_design": false
}
],
"Fiber":[{
"type_variety": "SSMF",
"dispersion": 1.67e-05,
"gamma": 0.00127,
"pmd_coef": 1.265e-15
},
{
"type_variety": "NZDF",
"dispersion": 0.5e-05,
"gamma": 0.00146,
"pmd_coef": 1.265e-15
},
{
"type_variety": "LOF",
"dispersion": 2.2e-05,
"gamma": 0.000843,
"pmd_coef": 1.265e-15
}
],
"RamanFiber":[{
"type_variety": "SSMF",
"dispersion": 1.67e-05,
"gamma": 0.00127,
"pmd_coef": 1.265e-15,
"raman_efficiency": {
"cr":[
0, 9.4E-06, 2.92E-05, 4.88E-05, 6.82E-05, 8.31E-05, 9.4E-05, 0.0001014, 0.0001069, 0.0001119,
0.0001217, 0.0001268, 0.0001365, 0.000149, 0.000165, 0.000181, 0.0001977, 0.0002192, 0.0002469,
0.0002749, 0.0002999, 0.0003206, 0.0003405, 0.0003592, 0.000374, 0.0003826, 0.0003841, 0.0003826,
0.0003802, 0.0003756, 0.0003549, 0.0003795, 0.000344, 0.0002933, 0.0002024, 0.0001158, 8.46E-05,
7.14E-05, 6.86E-05, 8.5E-05, 8.93E-05, 9.01E-05, 8.15E-05, 6.67E-05, 4.37E-05, 3.28E-05, 2.96E-05,
2.65E-05, 2.57E-05, 2.81E-05, 3.08E-05, 3.67E-05, 5.85E-05, 6.63E-05, 6.36E-05, 5.5E-05, 4.06E-05,
2.77E-05, 2.42E-05, 1.87E-05, 1.6E-05, 1.4E-05, 1.13E-05, 1.05E-05, 9.8E-06, 9.8E-06, 1.13E-05,
1.64E-05, 1.95E-05, 2.38E-05, 2.26E-05, 2.03E-05, 1.48E-05, 1.09E-05, 9.8E-06, 1.05E-05, 1.17E-05,
1.25E-05, 1.21E-05, 1.09E-05, 9.8E-06, 8.2E-06, 6.6E-06, 4.7E-06, 2.7E-06, 1.9E-06, 1.2E-06, 4E-07,
2E-07, 1E-07
],
"frequency_offset":[
0, 0.5e12, 1e12, 1.5e12, 2e12, 2.5e12, 3e12, 3.5e12, 4e12, 4.5e12, 5e12, 5.5e12, 6e12, 6.5e12, 7e12,
7.5e12, 8e12, 8.5e12, 9e12, 9.5e12, 10e12, 10.5e12, 11e12, 11.5e12, 12e12, 12.5e12, 12.75e12,
13e12, 13.25e12, 13.5e12, 14e12, 14.5e12, 14.75e12, 15e12, 15.5e12, 16e12, 16.5e12, 17e12,
17.5e12, 18e12, 18.25e12, 18.5e12, 18.75e12, 19e12, 19.5e12, 20e12, 20.5e12, 21e12, 21.5e12,
22e12, 22.5e12, 23e12, 23.5e12, 24e12, 24.5e12, 25e12, 25.5e12, 26e12, 26.5e12, 27e12, 27.5e12, 28e12,
28.5e12, 29e12, 29.5e12, 30e12, 30.5e12, 31e12, 31.5e12, 32e12, 32.5e12, 33e12, 33.5e12, 34e12, 34.5e12,
35e12, 35.5e12, 36e12, 36.5e12, 37e12, 37.5e12, 38e12, 38.5e12, 39e12, 39.5e12, 40e12, 40.5e12, 41e12,
41.5e12, 42e12
]
}
}
],
"Span":[{
"power_mode":true,
"delta_power_range_db": [-2,3,0.5],
"max_fiber_lineic_loss_for_raman": 0.25,
"target_extended_gain": 2.5,
"max_length": 150,
"length_units": "km",
"max_loss": 28,
"padding": 10,
"EOL": 0,
"con_in": 0,
"con_out": 0
}
],
"Roadm":[{
"target_pch_out_db": -20,
"add_drop_osnr": 38,
"pmd": 0,
"restrictions": {
"preamp_variety_list":[],
"booster_variety_list":[]
}
}],
"SI":[{
"f_min": 191.3e12,
"baud_rate": 32e9,
"f_max":195.1e12,
"spacing": 50e9,
"power_dbm": 0,
"power_range_db": [0,0,1],
"roll_off": 0.15,
"tx_osnr": 40,
"sys_margins": 2
}],
"Transceiver":[
{
"type_variety": "vendorA_trx-type1",
"frequency":{
"min": 191.35e12,
"max": 196.1e12
},
"mode":[
{
"format": "mode 1",
"baud_rate": 32e9,
"OSNR": 11,
"bit_rate": 100e9,
"roll_off": 0.15,
"tx_osnr": 40,
"min_spacing": 37.5e9,
"cost":1
},
{
"format": "mode 2",
"baud_rate": 66e9,
"OSNR": 15,
"bit_rate": 200e9,
"roll_off": 0.15,
"tx_osnr": 40,
"min_spacing": 75e9,
"cost":1
}
]
},
{
"type_variety": "Voyager",
"frequency":{
"min": 191.35e12,
"max": 196.1e12
},
"mode":[
{
"format": "mode 1",
"baud_rate": 32e9,
"OSNR": 12,
"bit_rate": 100e9,
"roll_off": 0.15,
"tx_osnr": 40,
"min_spacing": 37.5e9,
"cost":1
},
{
"format": "mode 3",
"baud_rate": 44e9,
"OSNR": 18,
"bit_rate": 300e9,
"roll_off": 0.15,
"tx_osnr": 40,
"min_spacing": 62.5e9,
"cost":1
},
{
"format": "mode 2",
"baud_rate": 66e9,
"OSNR": 21,
"bit_rate": 400e9,
"roll_off": 0.15,
"tx_osnr": 40,
"min_spacing": 75e9,
"cost":1
},
{
"format": "mode 4",
"baud_rate": 66e9,
"OSNR": 16,
"bit_rate": 200e9,
"roll_off": 0.15,
"tx_osnr": 40,
"min_spacing": 75e9,
"cost":1
}
]
}
]
}

View File

@@ -0,0 +1,196 @@
{
"elements": [
{
"uid": "trx Site_A",
"metadata": {
"location": {
"city": "Site_A",
"region": "",
"latitude": 0,
"longitude": 0
}
},
"type": "Transceiver"
},
{
"uid": "trx Site_C",
"metadata": {
"location": {
"city": "Site_C",
"region": "",
"latitude": 0,
"longitude": 0
}
},
"type": "Transceiver"
},
{
"uid": "roadm Site_A",
"metadata": {
"location": {
"city": "Site_A",
"region": "",
"latitude": 0,
"longitude": 0
}
},
"type": "Roadm",
"params": {
"loss": 17
}
},
{
"uid": "roadm Site_C",
"metadata": {
"location": {
"city": "Site_C",
"region": "",
"latitude": 0,
"longitude": 0
}
},
"type": "Roadm"
},
{
"uid": "ingress fused spans in Site_B",
"metadata": {
"location": {
"city": "Site_B",
"region": "",
"latitude": 0,
"longitude": 0
}
},
"type": "Fused",
"params": {
"loss": 0.5
}
},
{
"uid": "egress fused spans in Site_B",
"metadata": {
"location": {
"city": "Site_B",
"region": "",
"latitude": 0,
"longitude": 0
}
},
"type": "Fused"
},
{
"uid": "fiber (Site_A \u2192 Site_B)-",
"metadata": {
"location": {
"latitude": 0.0,
"longitude": 0.0
}
},
"type": "Fiber",
"type_variety": "SSMF",
"params": {
"length": 40.0,
"length_units": "km",
"loss_coef": 0.2
}
},
{
"uid": "fiber (Site_B \u2192 Site_C)-",
"metadata": {
"location": {
"latitude": 0.0,
"longitude": 0.0
}
},
"type": "Fiber",
"type_variety": "SSMF",
"params": {
"length": 50.0,
"length_units": "km",
"loss_coef": 0.2
}
},
{
"uid": "fiber (Site_B \u2192 Site_A)-",
"metadata": {
"location": {
"latitude": 0.0,
"longitude": 0.0
}
},
"type": "Fiber",
"type_variety": "SSMF",
"params": {
"length": 40.0,
"length_units": "km",
"loss_coef": 0.2
}
},
{
"uid": "fiber (Site_C \u2192 Site_B)-",
"metadata": {
"location": {
"latitude": 0.0,
"longitude": 0.0
}
},
"type": "Fiber",
"type_variety": "SSMF",
"params": {
"length": 50.0,
"length_units": "km",
"loss_coef": 0.2
}
}
],
"connections": [
{
"from_node": "roadm Site_A",
"to_node": "fiber (Site_A \u2192 Site_B)-"
},
{
"from_node": "fiber (Site_B \u2192 Site_A)-",
"to_node": "roadm Site_A"
},
{
"from_node": "fiber (Site_A \u2192 Site_B)-",
"to_node": "ingress fused spans in Site_B"
},
{
"from_node": "ingress fused spans in Site_B",
"to_node": "fiber (Site_B \u2192 Site_C)-"
},
{
"from_node": "fiber (Site_C \u2192 Site_B)-",
"to_node": "egress fused spans in Site_B"
},
{
"from_node": "egress fused spans in Site_B",
"to_node": "fiber (Site_B \u2192 Site_A)-"
},
{
"from_node": "roadm Site_C",
"to_node": "fiber (Site_C \u2192 Site_B)-"
},
{
"from_node": "fiber (Site_B \u2192 Site_C)-",
"to_node": "roadm Site_C"
},
{
"from_node": "trx Site_A",
"to_node": "roadm Site_A"
},
{
"from_node": "roadm Site_A",
"to_node": "trx Site_A"
},
{
"from_node": "trx Site_C",
"to_node": "roadm Site_C"
},
{
"from_node": "roadm Site_C",
"to_node": "trx Site_C"
}
]
}

Binary file not shown.

File diff suppressed because it is too large

Binary file not shown.

View File

@@ -0,0 +1,268 @@
{
"path-request": [
{
"request-id": "0",
"source": "trx Lorient_KMA",
"destination": "trx Vannes_KBE",
"src-tp-id": "trx Lorient_KMA",
"dst-tp-id": "trx Vannes_KBE",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "Voyager",
"trx_mode": null,
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 50000000000.0,
"max-nb-of-channel": 80,
"output-power": 0.0012589254117941673,
"path_bandwidth": 100000000000.0
}
}
},
{
"request-id": "1",
"source": "trx Brest_KLA",
"destination": "trx Vannes_KBE",
"src-tp-id": "trx Brest_KLA",
"dst-tp-id": "trx Vannes_KBE",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "Voyager",
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 50000000000.0,
"max-nb-of-channel": null,
"output-power": 0.0012589254117941673,
"path_bandwidth": 200000000000.0
}
},
"explicit-route-objects": {
"route-object-include-exclude": [
{
"explicit-route-usage": "route-include-ero",
"index": 0,
"num-unnum-hop": {
"node-id": "roadm Brest_KLA",
"link-tp-id": "link-tp-id is not used",
"hop-type": "LOOSE"
}
},
{
"explicit-route-usage": "route-include-ero",
"index": 1,
"num-unnum-hop": {
"node-id": "roadm Lannion_CAS",
"link-tp-id": "link-tp-id is not used",
"hop-type": "LOOSE"
}
},
{
"explicit-route-usage": "route-include-ero",
"index": 2,
"num-unnum-hop": {
"node-id": "roadm Lorient_KMA",
"link-tp-id": "link-tp-id is not used",
"hop-type": "LOOSE"
}
},
{
"explicit-route-usage": "route-include-ero",
"index": 3,
"num-unnum-hop": {
"node-id": "roadm Vannes_KBE",
"link-tp-id": "link-tp-id is not used",
"hop-type": "LOOSE"
}
}
]
}
},
{
"request-id": "3",
"source": "trx Lannion_CAS",
"destination": "trx Rennes_STA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Rennes_STA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "vendorA_trx-type1",
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 50000000000.0,
"max-nb-of-channel": null,
"output-power": null,
"path_bandwidth": 60000000000.0
}
}
},
{
"request-id": "4",
"source": "trx Rennes_STA",
"destination": "trx Lannion_CAS",
"src-tp-id": "trx Rennes_STA",
"dst-tp-id": "trx Lannion_CAS",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "vendorA_trx-type1",
"trx_mode": null,
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 75000000000.0,
"max-nb-of-channel": null,
"output-power": 0.0019952623149688794,
"path_bandwidth": 150000000000.0
}
}
},
{
"request-id": "5",
"source": "trx Rennes_STA",
"destination": "trx Lannion_CAS",
"src-tp-id": "trx Rennes_STA",
"dst-tp-id": "trx Lannion_CAS",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "vendorA_trx-type1",
"trx_mode": "mode 2",
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 75000000000.0,
"max-nb-of-channel": 63,
"output-power": 0.0019952623149688794,
"path_bandwidth": 20000000000.0
}
}
},
{
"request-id": "6",
"source": "trx Lannion_CAS",
"destination": "trx Lorient_KMA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Lorient_KMA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "Voyager",
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 50000000000.0,
"max-nb-of-channel": 76,
"output-power": 0.001,
"path_bandwidth": 300000000000.0
}
}
},
{
"request-id": "7",
"source": "trx Lannion_CAS",
"destination": "trx Lorient_KMA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Lorient_KMA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "Voyager",
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 50000000000.0,
"max-nb-of-channel": 76,
"output-power": 0.001,
"path_bandwidth": 400000000000.0
}
}
},
{
"request-id": "7b",
"source": "trx Lannion_CAS",
"destination": "trx Lorient_KMA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Lorient_KMA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
"trx_type": "Voyager",
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"N": "null",
"M": "null"
}
],
"spacing": 75000000000.0,
"max-nb-of-channel": 50,
"output-power": 0.001,
"path_bandwidth": 400000000000.0
}
}
}
],
"synchronization": [
{
"synchronization-id": "3",
"svec": {
"relaxable": "false",
"disjointness": "node link",
"request-id-number": [
"3",
"1"
]
}
},
{
"synchronization-id": "4",
"svec": {
"relaxable": "false",
"disjointness": "node link",
"request-id-number": [
"4",
"5"
]
}
}
]
}

View File

@@ -0,0 +1,98 @@
{
"elements": [
{
"uid": "Site_A",
"type": "Transceiver",
"metadata": {
"location": {
"latitude": 0,
"longitude": 0,
"city": "Site A",
"region": ""
}
}
},
{
"uid": "Span1",
"type": "RamanFiber",
"type_variety": "SSMF",
"operational": {
"temperature": 283,
"raman_pumps": [
{
"power": 200e-3,
"frequency": 205e12,
"propagation_direction": "counterprop"
},
{
"power": 206e-3,
"frequency": 201e12,
"propagation_direction": "counterprop"
}
]
},
"params": {
"type_variety": "SSMF",
"length": 80.0,
"loss_coef": 0.2,
"length_units": "km",
"att_in": 0,
"con_in": 0.5,
"con_out": 0.5
},
"metadata": {
"location": {
"latitude": 1,
"longitude": 0,
"city": null,
"region": ""
}
}
},
{
"uid": "Edfa1",
"type": "Edfa",
"type_variety": "std_low_gain",
"operational": {
"gain_target": 15.0,
"delta_p": -2,
"tilt_target": 0,
"out_voa": 0
},
"metadata": {
"location": {
"latitude": 2,
"longitude": 0,
"city": null,
"region": ""
}
}
},
{
"uid": "Site_B",
"type": "Transceiver",
"metadata": {
"location": {
"latitude": 2,
"longitude": 0,
"city": "Site B",
"region": ""
}
}
}
],
"connections": [
{
"from_node": "Site_A",
"to_node": "Span1"
},
{
"from_node": "Span1",
"to_node": "Edfa1"
},
{
"from_node": "Edfa1",
"to_node": "Site_B"
}
]
}

View File

@@ -0,0 +1,14 @@
{
"raman_parameters": {
"flag_raman": true,
"space_resolution": 10e3,
"tolerance": 1e-8
},
"nli_parameters": {
"nli_method_name": "ggn_spectrally_separated",
"wdm_grid_size": 50e9,
"dispersion_tolerance": 1,
"phase_shift_tolerance": 0.1,
"computed_channels": [1, 18, 37, 56, 75]
}
}

View File

@@ -0,0 +1,303 @@
{ "nf_fit_coeff": [
0.000168241,
0.0469961,
0.0359549,
5.82851
],
"f_min": 191.35e12,
"f_max": 196.1e12,
"nf_ripple": [
-0.3110761646066259,
-0.3110761646066259,
-0.31110274831665313,
-0.31419329378173544,
-0.3172854168606314,
-0.32037911876162584,
-0.3233255190215882,
-0.31624321721895354,
-0.30915729645781326,
-0.30206775396360075,
-0.2949045115165272,
-0.26632156113294336,
-0.23772399031437283,
-0.20911178784023846,
-0.18048410390821285,
-0.14379944379052215,
-0.10709599992470213,
-0.07037375788020579,
-0.03372858157230583,
-0.015660302006048,
0.0024172385953583004,
0.020504047353947653,
0.03860013139908377,
0.05670549786742816,
0.07482015390297145,
0.0838762040768461,
0.09284481475528361,
0.1018180306253394,
0.11079585523492333,
0.1020395478432815,
0.09310160456603413,
0.08415906712621996,
0.07521193198077789,
0.0676340601339394,
0.06005437964543287,
0.052470799141237305,
0.044883315610536455,
0.037679759069084225,
0.03047647598902483,
0.02326948274513522,
0.01605877647020772,
0.021248462316134083,
0.02657315875107553,
0.03190060058247842,
0.03723078993416436,
0.04256372893215024,
0.047899419704645264,
0.03915515813685565,
0.030289222542492025,
0.021418708618354456,
0.012573926129294415,
0.006240488799898697,
-9.622162373026585e-05,
-0.006436207679519103,
-0.012779471908040341,
-0.02038153550619876,
-0.027999803010447587,
-0.035622012697103154,
-0.043236398934156144,
-0.04493583574805963,
-0.04663615264317309,
-0.048337350303318156,
-0.050039429413028365,
-0.051742390657545205,
-0.05342028484370278,
-0.05254242298580185,
-0.05166410580536087,
-0.05078533294804249,
-0.04990610405914272,
-0.05409792133358102,
-0.05832916277634124,
-0.06256260169582961,
-0.06660356886269536,
-0.04779792991567815,
-0.028982516728038848,
-0.010157321677553965,
0.00861320615127981,
0.01913736978785662,
0.029667009055877668,
0.04020212822983975,
0.050742731588695494,
0.061288823415841555,
0.07184040799914815,
0.1043252636301016,
0.13687829834471027,
0.1694483010211072,
0.202035284929368,
0.23624619427167134,
0.27048596623174515,
0.30474360397422756,
0.3390191214858807,
0.36358851509924695,
0.38814205928193013,
0.41270842850729195,
0.4372876328262819,
0.4372876328262819
],
"dgt": [
2.714526681131686,
2.705443819238505,
2.6947834587664494,
2.6841217449620203,
2.6681935771243177,
2.6521732021128046,
2.630396440815385,
2.602860350286428,
2.5696460593920065,
2.5364027376452056,
2.499446286796604,
2.4587748041127506,
2.414398437185221,
2.3699990328716107,
2.322373696229342,
2.271520771371253,
2.2174389328192197,
2.16337565384239,
2.1183028432496016,
2.082225099873648,
2.055100772005235,
2.0279625371819305,
2.0008103857988204,
1.9736443063300082,
1.9482128147680253,
1.9245345552113182,
1.9026104247588487,
1.8806927939516411,
1.862235672444246,
1.847275503201129,
1.835814081380705,
1.824381436842932,
1.8139629377087627,
1.8045606557581335,
1.7961751115773796,
1.7877868031023945,
1.7793941781790852,
1.7709972329654864,
1.7625959636196327,
1.7541903672600494,
1.7459181197626403,
1.737780757913635,
1.7297783508684146,
1.7217732861435076,
1.7137640932265894,
1.7057507692361864,
1.6918150918099673,
1.6719047669939942,
1.6460167077689267,
1.6201194134191075,
1.5986915141218316,
1.5817353179379183,
1.569199764184379,
1.5566577309558969,
1.545374152761467,
1.5353620432989845,
1.5266220576235803,
1.5178910621476225,
1.5097346239790443,
1.502153039909686,
1.495145456062699,
1.488134243479226,
1.48111939735681,
1.474100442252211,
1.4670307626366115,
1.4599103316162523,
1.45273959485914,
1.445565137158368,
1.4340878115214444,
1.418273806730323,
1.3981208704326855,
1.3779439775587023,
1.3598972673004606,
1.3439818461440451,
1.3301807335621048,
1.316383926863083,
1.3040618749785347,
1.2932153453410835,
1.2838336236692311,
1.2744470198196236,
1.2650555289898042,
1.2556591482982988,
1.2428104897182262,
1.2264996957264114,
1.2067249615595257,
1.1869318618366975,
1.1672278304018044,
1.1476135933863398,
1.1280891949729075,
1.108555289615659,
1.0895983485572227,
1.0712204022764056,
1.0534217504465226,
1.0356155337864215,
1.017807767853702,
1.0
],
"gain_ripple": [
0.1359703369791596,
0.11822862697916037,
0.09542181697916163,
0.06245819697916133,
0.02602813697916062,
-0.0036199830208403228,
-0.018326963020840026,
-0.0246928330208398,
-0.016792253020838643,
-0.0028138630208403015,
0.017572956979162058,
0.038328296979159404,
0.054956336979159914,
0.0670723869791594,
0.07091459697916136,
0.07094413697916124,
0.07114372697916238,
0.07533675697916209,
0.08731066697916035,
0.10313984697916112,
0.12276252697916235,
0.14239527697916188,
0.15945681697916214,
0.1739275269791598,
0.1767381569791624,
0.17037189697916233,
0.15216302697916007,
0.13114358697916018,
0.10802383697916085,
0.08548825697916129,
0.06916723697916183,
0.05848224697916038,
0.05447361697916264,
0.05154489697916276,
0.04946107697915991,
0.04717897697916129,
0.04551704697916037,
0.04467697697916151,
0.04072968697916224,
0.03285456697916089,
0.023488786979161347,
0.01659282697915998,
0.013321846979160057,
0.011234826979162449,
0.01030063697916006,
0.00936596697916059,
0.00874012697916271,
0.00842583697916055,
0.006965146979162284,
0.0040435869791615175,
0.0007104669791608842,
-0.0015763130208377163,
-0.006936193020838033,
-0.016475303020840215,
-0.028748483020837767,
-0.039618433020837784,
-0.051112303020840244,
-0.06468462302083822,
-0.07868024302083754,
-0.09101254302083817,
-0.10103437302083762,
-0.11041488302083735,
-0.11916081302083725,
-0.12789859302083784,
-0.1353792530208402,
-0.14160178302083892,
-0.1455411330208385,
-0.1484450830208388,
-0.14823350302084037,
-0.14591937302083835,
-0.1409032730208395,
-0.13525493302083902,
-0.1279646530208396,
-0.11963431302083904,
-0.11089282302084058,
-0.1027863830208382,
-0.09717347302083823,
-0.09343261302083761,
-0.0913487130208388,
-0.08906007302083907,
-0.0865687230208394,
-0.08407607302083875,
-0.07844600302084004,
-0.06968090302083851,
-0.05947139302083926,
-0.05095282302083959,
-0.042428283020839785,
-0.03218106302083967,
-0.01819858302084043,
-0.0021726530208390216,
0.01393231697916164,
0.028098946979159933,
0.040326236979161934,
0.05257029697916238,
0.06479749697916048,
0.07704745697916238
]
}


@@ -0,0 +1,35 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
write_path_jsontocsv.py
========================
Reads a JSON path result file, in accordance with the YANG model for requesting
path computation, and writes the results to a CSV file.
See: draft-ietf-teas-yang-path-computation-01.txt
"""
from argparse import ArgumentParser
from pathlib import Path
from json import loads
from gnpy.tools.json_io import load_equipment
from gnpy.topology.request import jsontocsv
parser = ArgumentParser(description='A function that writes JSON path results into a CSV file.')
parser.add_argument('filename', nargs='?', type=Path)
parser.add_argument('output_filename', nargs='?', type=Path)
parser.add_argument('eqpt_filename', nargs='?', type=Path, default=Path(__file__).parent / 'eqpt_config.json')
if __name__ == '__main__':
args = parser.parse_args()
with open(args.output_filename, 'w', encoding='utf-8') as file:
with open(args.filename, encoding='utf-8') as f:
print(f'Reading {args.filename}')
json_data = loads(f.read())
equipment = load_equipment(args.eqpt_filename)
print(f'Writing in {args.output_filename}')
jsontocsv(json_data, equipment, file)
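The same conversion can also be driven directly from Python rather than through the script's argument parser. A minimal sketch, assuming hypothetical file names (path_results.json, path_results.csv) and the bundled eqpt_config.json:

from json import loads
from pathlib import Path
from gnpy.tools.json_io import load_equipment
from gnpy.topology.request import jsontocsv

# hypothetical file names, for illustration only
equipment = load_equipment(Path('eqpt_config.json'))
with open('path_results.json', encoding='utf-8') as f, open('path_results.csv', 'w', encoding='utf-8') as csv_out:
    jsontocsv(loads(f.read()), equipment, csv_out)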

5
gnpy/tools/__init__.py Normal file

@@ -0,0 +1,5 @@
'''
Processing of data via :py:mod:`.json_io`.
Utilities for Excel conversion in :py:mod:`.convert` and :py:mod:`.service_sheet`.
Example code in :py:mod:`.cli_examples` and :py:mod:`.plots`.
'''

433
gnpy/tools/cli_examples.py Normal file

@@ -0,0 +1,433 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.tools.cli_examples
=======================
Common code for CLI examples
'''
import argparse
from json import dumps
import logging
import os.path
import sys
from math import ceil
from numpy import linspace, mean
from pathlib import Path
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.core.elements import Transceiver, Fiber, RamanFiber
from gnpy.core.equipment import trx_mode_params
import gnpy.core.exceptions as exceptions
from gnpy.core.network import build_network
from gnpy.core.parameters import SimParams
from gnpy.core.science_utils import Simulation
from gnpy.core.utils import db2lin, lin2db, automatic_nch
from gnpy.topology.request import (ResultElement, jsontocsv, compute_path_dsjctn, requests_aggregation,
BLOCKING_NOPATH, correct_json_route_list,
deduplicate_disjunctions, compute_path_with_disjunction,
PathRequest, compute_constrained_path, propagate2)
from gnpy.topology.spectrum_assignment import build_oms_list, pth_assign_spectrum
from gnpy.tools.json_io import load_equipment, load_network, load_json, load_requests, save_network, \
requests_from_json, disjunctions_from_json, save_json
from gnpy.tools.plots import plot_baseline, plot_results
_logger = logging.getLogger(__name__)
_examples_dir = Path(__file__).parent.parent / 'example-data'
_help_footer = '''
This program is part of GNPy, https://github.com/TelecomInfraProject/oopt-gnpy
Learn more at https://gnpy.readthedocs.io/
'''
_help_fname_json = 'FILE.json'
_help_fname_json_csv = 'FILE.(json|csv)'
def show_example_data_dir():
print(f'{_examples_dir}/')
def load_common_data(equipment_filename, topology_filename, simulation_filename, save_raw_network_filename):
'''Load common configuration from JSON files'''
try:
equipment = load_equipment(equipment_filename)
network = load_network(topology_filename, equipment)
if save_raw_network_filename is not None:
save_network(network, save_raw_network_filename)
print(f'{ansi_escapes.blue}Raw network (no optimizations) saved to {save_raw_network_filename}{ansi_escapes.reset}')
sim_params = SimParams(**load_json(simulation_filename)) if simulation_filename is not None else None
if not sim_params:
if next((node for node in network if isinstance(node, RamanFiber)), None) is not None:
print(f'{ansi_escapes.red}Invocation error:{ansi_escapes.reset} '
f'RamanFiber requires passing simulation params via --sim-params')
sys.exit(1)
else:
Simulation.set_params(sim_params)
except exceptions.EquipmentConfigError as e:
print(f'{ansi_escapes.red}Configuration error in the equipment library:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.NetworkTopologyError as e:
print(f'{ansi_escapes.red}Invalid network definition:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.ConfigurationError as e:
print(f'{ansi_escapes.red}Configuration error:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.ParametersError as e:
print(f'{ansi_escapes.red}Simulation parameters error:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.ServiceError as e:
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
sys.exit(1)
return (equipment, network)
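# A minimal usage sketch (hypothetical file names, for illustration only):
#   equipment, network = load_common_data(Path('eqpt_config.json'), Path('my_topology.json'), None, None)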
def _setup_logging(args):
logging.basicConfig(level={2: logging.DEBUG, 1: logging.INFO, 0: logging.CRITICAL}.get(args.verbose, logging.DEBUG))
def _add_common_options(parser: argparse.ArgumentParser, network_default: Path):
parser.add_argument('topology', nargs='?', type=Path, metavar='NETWORK-TOPOLOGY.(json|xls|xlsx)',
default=network_default,
help='Input network topology')
parser.add_argument('-v', '--verbose', action='count', default=0,
help='Increase verbosity (can be specified several times)')
parser.add_argument('-e', '--equipment', type=Path, metavar=_help_fname_json,
default=_examples_dir / 'eqpt_config.json', help='Equipment library')
parser.add_argument('--sim-params', type=Path, metavar=_help_fname_json,
default=None, help='Path to the JSON containing simulation parameters (required for Raman). '
f'Example: {_examples_dir / "sim_params.json"}')
parser.add_argument('--save-network', type=Path, metavar=_help_fname_json,
help='Save the final network as a JSON file')
parser.add_argument('--save-network-before-autodesign', type=Path, metavar=_help_fname_json,
help='Dump the network into a JSON file prior to autodesign')
def transmission_main_example(args=None):
parser = argparse.ArgumentParser(
description='Send a full spectrum load through the network from point A to point B',
epilog=_help_footer,
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
)
_add_common_options(parser, network_default=_examples_dir / 'edfa_example_network.json')
parser.add_argument('--show-channels', action='store_true', help='Show final per-channel OSNR summary')
parser.add_argument('-pl', '--plot', action='store_true')
parser.add_argument('-l', '--list-nodes', action='store_true', help='list all transceiver nodes')
parser.add_argument('-po', '--power', default=0, help='channel ref power in dBm')
parser.add_argument('source', nargs='?', help='source node')
parser.add_argument('destination', nargs='?', help='destination node')
args = parser.parse_args(args if args is not None else sys.argv[1:])
_setup_logging(args)
(equipment, network) = load_common_data(args.equipment, args.topology, args.sim_params, args.save_network_before_autodesign)
if args.plot:
plot_baseline(network)
transceivers = {n.uid: n for n in network.nodes() if isinstance(n, Transceiver)}
if not transceivers:
sys.exit('Network has no transceivers!')
if len(transceivers) < 2:
sys.exit('Network has only one transceiver!')
if args.list_nodes:
for uid in transceivers:
print(uid)
sys.exit()
# First try to find exact match if source/destination provided
if args.source:
source = transceivers.pop(args.source, None)
valid_source = True if source else False
else:
source = None
_logger.info('No source node specified: picking random transceiver')
if args.destination:
destination = transceivers.pop(args.destination, None)
valid_destination = True if destination else False
else:
destination = None
_logger.info('No destination node specified: picking random transceiver')
# If no exact match try to find partial match
if args.source and not source:
# TODO code a more advanced regex to find nodes match
source = next((transceivers.pop(uid) for uid in transceivers
if args.source.lower() in uid.lower()), None)
if args.destination and not destination:
# TODO code a more advanced regex to find nodes match
destination = next((transceivers.pop(uid) for uid in transceivers
if args.destination.lower() in uid.lower()), None)
# If no partial match or no source/destination provided pick random
if not source:
source = list(transceivers.values())[0]
del transceivers[source.uid]
if not destination:
destination = list(transceivers.values())[0]
_logger.info(f'source = {args.source!r}')
_logger.info(f'destination = {args.destination!r}')
params = {}
params['request_id'] = 0
params['trx_type'] = ''
params['trx_mode'] = ''
params['source'] = source.uid
params['destination'] = destination.uid
params['bidir'] = False
params['nodes_list'] = [destination.uid]
params['loose_list'] = ['strict']
params['format'] = ''
params['path_bandwidth'] = 0
trx_params = trx_mode_params(equipment)
if args.power:
trx_params['power'] = db2lin(float(args.power)) * 1e-3
params.update(trx_params)
req = PathRequest(**params)
power_mode = equipment['Span']['default'].power_mode
print('\n'.join([f'Power mode is set to {power_mode}',
f'=> it can be modified in eqpt_config.json - Span']))
pref_ch_db = lin2db(req.power * 1e3) # reference channel power / span (SL=20dB)
pref_total_db = pref_ch_db + lin2db(req.nb_channel) # reference total power / span (SL=20dB)
build_network(network, equipment, pref_ch_db, pref_total_db)
path = compute_constrained_path(network, req)
spans = [s.params.length for s in path if isinstance(s, RamanFiber) or isinstance(s, Fiber)]
print(f'\nThere are {len(spans)} fiber spans over {sum(spans)/1000:.0f} km between {source.uid} '
f'and {destination.uid}')
print(f'\nNow propagating between {source.uid} and {destination.uid}:')
try:
p_start, p_stop, p_step = equipment['SI']['default'].power_range_db
p_num = abs(int(round((p_stop - p_start) / p_step))) + 1 if p_step != 0 else 1
power_range = list(linspace(p_start, p_stop, p_num))
except TypeError:
print('invalid power range definition in eqpt_config, should be power_range_db: [lower, upper, step]')
power_range = [0]
if not power_mode:
# power cannot be changed in gain mode
power_range = [0]
for dp_db in power_range:
req.power = db2lin(pref_ch_db + dp_db) * 1e-3
if power_mode:
print(f'\nPropagating with input power = {ansi_escapes.cyan}{lin2db(req.power*1e3):.2f} dBm{ansi_escapes.reset}:')
else:
print(f'\nPropagating in {ansi_escapes.cyan}gain mode{ansi_escapes.reset}: power cannot be set manually')
infos = propagate2(path, req, equipment)
if len(power_range) == 1:
for elem in path:
print(elem)
if power_mode:
print(f'\nTransmission result for input power = {lin2db(req.power*1e3):.2f} dBm:')
else:
print(f'\nTransmission results:')
print(f' Final SNR total (0.1 nm): {ansi_escapes.cyan}{mean(destination.snr_01nm):.02f} dB{ansi_escapes.reset}')
else:
print(path[-1])
# print(f'\n !!!!!!!!!!!!!!!!! TEST POINT !!!!!!!!!!!!!!!!!!!!!')
# print(f'carriers ase output of {path[1]} =\n {list(path[1].carriers("out", "nli"))}')
# => use "in" or "out" parameter
# => use "nli" or "ase" or "signal" or "total" parameter
if args.save_network is not None:
save_network(network, args.save_network)
print(f'{ansi_escapes.blue}Network (after autodesign) saved to {args.save_network}{ansi_escapes.reset}')
if args.show_channels:
print('\nThe total SNR per channel at the end of the line is:')
print(
'{:>5}{:>26}{:>26}{:>28}{:>28}{:>28}' .format(
'Ch. #',
'Channel frequency (THz)',
'Channel power (dBm)',
'OSNR ASE (signal bw, dB)',
'SNR NLI (signal bw, dB)',
'SNR total (signal bw, dB)'))
for final_carrier, ch_osnr, ch_snr_nl, ch_snr in zip(
infos[path[-1]][1].carriers, path[-1].osnr_ase, path[-1].osnr_nli, path[-1].snr):
ch_freq = final_carrier.frequency * 1e-12
ch_power = lin2db(final_carrier.power.signal * 1e3)
print(
'{:5}{:26.2f}{:26.2f}{:28.2f}{:28.2f}{:28.2f}' .format(
final_carrier.channel_number, round(
ch_freq, 2), round(
ch_power, 2), round(
ch_osnr, 2), round(
ch_snr_nl, 2), round(
ch_snr, 2)))
if not args.source:
print(f'\n(No source node specified: picked {source.uid})')
elif not valid_source:
print(f'\n(Invalid source node {args.source!r} replaced with {source.uid})')
if not args.destination:
print(f'\n(No destination node specified: picked {destination.uid})')
elif not valid_destination:
print(f'\n(Invalid destination node {args.destination!r} replaced with {destination.uid})')
if args.plot:
plot_results(network, path, source, destination, infos)
def _path_result_json(pathresult):
return {'response': [n.json for n in pathresult]}
def path_requests_run(args=None):
parser = argparse.ArgumentParser(
description='Compute performance for a list of services provided in a json file or an excel sheet',
epilog=_help_footer,
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
)
_add_common_options(parser, network_default=_examples_dir / 'meshTopologyExampleV2.xls')
parser.add_argument('service_filename', nargs='?', type=Path, metavar='SERVICES-REQUESTS.(json|xls|xlsx)',
default=_examples_dir / 'meshTopologyExampleV2.xls',
help='Input service file')
parser.add_argument('-bi', '--bidir', action='store_true',
help='considers that all demands are bidir')
parser.add_argument('-o', '--output', type=Path, metavar=_help_fname_json_csv,
help='Store satisfied requests into a JSON or CSV file')
args = parser.parse_args(args if args is not None else sys.argv[1:])
_setup_logging(args)
_logger.info(f'Computing path requests {args.service_filename} into JSON format')
print(f'{ansi_escapes.blue}Computing path requests {os.path.relpath(args.service_filename)} into JSON format{ansi_escapes.reset}')
(equipment, network) = load_common_data(args.equipment, args.topology, args.sim_params, args.save_network_before_autodesign)
# Build the network once using the default power defined in SI in eqpt config
# TODO power density: db2lin(power_dbm) / power_dbm * nb channels as defined by
# spacing, f_min and f_max
p_db = equipment['SI']['default'].power_dbm
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
if args.save_network is not None:
save_network(network, args.save_network)
print(f'{ansi_escapes.blue}Network (after autodesign) saved to {args.save_network}{ansi_escapes.reset}')
oms_list = build_oms_list(network, equipment)
try:
data = load_requests(args.service_filename, equipment, bidir=args.bidir,
network=network, network_filename=args.topology)
rqs = requests_from_json(data, equipment)
except exceptions.ServiceError as e:
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
sys.exit(1)
# Check that request ids are unique. Non-unique ids may
# mess up the computation: better to stop the computation.
all_ids = [r.request_id for r in rqs]
if len(all_ids) != len(set(all_ids)):
for item in list(set(all_ids)):
all_ids.remove(item)
msg = f'Requests id {all_ids} are not unique'
_logger.critical(msg)
sys.exit()
rqs = correct_json_route_list(network, rqs)
# pths = compute_path(network, equipment, rqs)
dsjn = disjunctions_from_json(data)
print(f'{ansi_escapes.blue}List of disjunctions{ansi_escapes.reset}')
print(dsjn)
# need to warn or correct in case of wrong disjunction form
# disjunction must not be repeated with same or different ids
dsjn = deduplicate_disjunctions(dsjn)
# Aggregate demands with same exact constraints
print(f'{ansi_escapes.blue}Aggregating similar requests{ansi_escapes.reset}')
rqs, dsjn = requests_aggregation(rqs, dsjn)
# TODO export novel set of aggregated demands in a json file
print(f'{ansi_escapes.blue}The following services have been requested:{ansi_escapes.reset}')
print(rqs)
print(f'{ansi_escapes.blue}Computing all paths with constraints{ansi_escapes.reset}')
try:
pths = compute_path_dsjctn(network, equipment, rqs, dsjn)
except exceptions.DisjunctionError as this_e:
print(f'{ansi_escapes.red}Disjunction error:{ansi_escapes.reset} {this_e}')
sys.exit(1)
print(f'{ansi_escapes.blue}Propagating on selected path{ansi_escapes.reset}')
propagatedpths, reversed_pths, reversed_propagatedpths = compute_path_with_disjunction(network, equipment, rqs, pths)
# Note that the deepcopy used in compute_path_with_disjunction returns
# a list of nodes which do not belong to the network (they are copies of the node objects),
# so there cannot be any further propagation on these nodes.
pth_assign_spectrum(pths, rqs, oms_list, reversed_pths)
print(f'{ansi_escapes.blue}Result summary{ansi_escapes.reset}')
header = ['req id', ' demand', ' snr@bandwidth A-Z (Z-A)', ' snr@0.1nm A-Z (Z-A)',
' Receiver minOSNR', ' mode', ' Gbit/s', ' nb of tsp pairs',
'N,M or blocking reason']
data = []
data.append(header)
for i, this_p in enumerate(propagatedpths):
rev_pth = reversed_propagatedpths[i]
if rev_pth and this_p:
psnrb = f'{round(mean(this_p[-1].snr),2)} ({round(mean(rev_pth[-1].snr),2)})'
psnr = f'{round(mean(this_p[-1].snr_01nm), 2)}' +\
f' ({round(mean(rev_pth[-1].snr_01nm),2)})'
elif this_p:
psnrb = f'{round(mean(this_p[-1].snr),2)}'
psnr = f'{round(mean(this_p[-1].snr_01nm),2)}'
try:
if rqs[i].blocking_reason in BLOCKING_NOPATH:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} :',
f'-', f'-', f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',
f'-', f'{rqs[i].blocking_reason}']
else:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,
psnr, f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9, 2)}',
f'-', f'{rqs[i].blocking_reason}']
except AttributeError:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,
psnr, f'{rqs[i].OSNR}', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',
f'{ceil(rqs[i].path_bandwidth / rqs[i].bit_rate) }', f'({rqs[i].N},{rqs[i].M})']
data.append(line)
col_width = max(len(word) for row in data for word in row[2:]) # padding
firstcol_width = max(len(row[0]) for row in data) # padding
secondcol_width = max(len(row[1]) for row in data) # padding
for row in data:
firstcol = ''.join(row[0].ljust(firstcol_width))
secondcol = ''.join(row[1].ljust(secondcol_width))
remainingcols = ''.join(word.center(col_width, ' ') for word in row[2:])
print(f'{firstcol} {secondcol} {remainingcols}')
print(f'{ansi_escapes.yellow}Result summary shows mean SNR and OSNR (average over all channels){ansi_escapes.reset}')
if args.output:
result = []
# assumes that the list of rqs and the list of propagatedpths have the same order
for i, pth in enumerate(propagatedpths):
result.append(ResultElement(rqs[i], pth, reversed_propagatedpths[i]))
temp = _path_result_json(result)
if args.output.suffix.lower() == '.json':
save_json(temp, args.output)
print(f'{ansi_escapes.blue}Saved JSON to {args.output}{ansi_escapes.reset}')
elif args.output.suffix.lower() == '.csv':
with open(args.output, "w", encoding='utf-8') as fcsv:
jsontocsv(temp, equipment, fcsv)
print(f'{ansi_escapes.blue}Saved CSV to {args.output}{ansi_escapes.reset}')
else:
print(f'{ansi_escapes.red}Cannot save output: neither JSON nor CSV file{ansi_escapes.reset}')
sys.exit(1)
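Both entry points accept an explicit argument list, so they can also be driven from another Python script or a test instead of the command line. A minimal sketch, assuming hypothetical topology, service file and transceiver names:

from gnpy.tools.cli_examples import transmission_main_example, path_requests_run

# hypothetical file and node names, for illustration only
transmission_main_example(['my_topology.json', 'trx Site_A', 'trx Site_B', '--show-channels'])
path_requests_run(['my_topology.json', 'my_services.json', '--output', 'results.json'])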

745
gnpy/tools/convert.py Executable file

@@ -0,0 +1,745 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.tools.convert
==================
This module contains utilities for converting between XLS and JSON.
The input XLS file must contain sheets named "Nodes" and "Links".
It may optionally contain a sheet named "Eqpt".
In the "Nodes" sheet, only the "City" column is mandatory. The column "Type"
can be determined automatically given the topology (e.g., if degree 2, ILA;
otherwise, ROADM.) Incorrectly specified types (e.g., ILA for node of
degree ≠ 2) will be automatically corrected.
In the "Links" sheet, only the first three columns ("Node A", "Node Z" and
"east Distance (km)") are mandatory. Missing "west" information is copied from
the "east" information so that it is possible to input undirected data.
"""
from sys import exit
from xlrd import open_workbook
from argparse import ArgumentParser
from collections import namedtuple, Counter, defaultdict
from itertools import chain
from json import dumps
from pathlib import Path
from copy import copy
from gnpy.core import ansi_escapes
from gnpy.core.utils import silent_remove
from gnpy.core.exceptions import NetworkTopologyError
from gnpy.core.elements import Edfa, Fused, Fiber
def all_rows(sh, start=0):
return (sh.row(x) for x in range(start, sh.nrows))
class Node(object):
def __init__(self, **kwargs):
super(Node, self).__init__()
self.update_attr(kwargs)
def update_attr(self, kwargs):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in self.default_values.items():
v = clean_kwargs.get(k, v)
setattr(self, k, v)
default_values = {
'city': '',
'state': '',
'country': '',
'region': '',
'latitude': 0,
'longitude': 0,
'node_type': 'ILA',
'booster_restriction': '',
'preamp_restriction': ''
}
class Link(object):
"""attribtes from west parse_ept_headers dict
+node_a, node_z, west_fiber_con_in, east_fiber_con_in
"""
def __init__(self, **kwargs):
super(Link, self).__init__()
self.update_attr(kwargs)
self.distance_units = 'km'
def update_attr(self, kwargs):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in self.default_values.items():
v = clean_kwargs.get(k, v)
setattr(self, k, v)
k = 'west' + k.split('east')[-1]
v = clean_kwargs.get(k, v)
setattr(self, k, v)
def __eq__(self, link):
return (self.from_city == link.from_city and self.to_city == link.to_city) \
or (self.from_city == link.to_city and self.to_city == link.from_city)
default_values = {
'from_city': '',
'to_city': '',
'east_distance': 80,
'east_fiber': 'SSMF',
'east_lineic': 0.2,
'east_con_in': None,
'east_con_out': None,
'east_pmd': 0.1,
'east_cable': ''
}
class Eqpt(object):
def __init__(self, **kwargs):
super(Eqpt, self).__init__()
self.update_attr(kwargs)
def update_attr(self, kwargs):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in self.default_values.items():
v_east = clean_kwargs.get(k, v)
setattr(self, k, v_east)
k = 'west' + k.split('east')[-1]
v_west = clean_kwargs.get(k, v)
setattr(self, k, v_west)
default_values = {
'from_city': '',
'to_city': '',
'east_amp_type': '',
'east_att_in': 0,
'east_amp_gain': None,
'east_amp_dp': None,
'east_tilt': 0,
'east_att_out': None
}
def read_header(my_sheet, line, slice_):
""" return the list of headers !:= ''
header_i = [(header, header_column_index), ...]
in a {line, slice1_x, slice_y} range
"""
Param_header = namedtuple('Param_header', 'header colindex')
try:
header = [x.value.strip() for x in my_sheet.row_slice(line, slice_[0], slice_[1])]
header_i = [Param_header(header, i + slice_[0]) for i, header in enumerate(header) if header != '']
except Exception:
header_i = []
if header_i != [] and header_i[-1].colindex != slice_[1]:
header_i.append(Param_header('', slice_[1]))
return header_i
def read_slice(my_sheet, line, slice_, header):
"""return the slice range of a given header
in a defined range {line, slice_x, slice_y}"""
header_i = read_header(my_sheet, line, slice_)
slice_range = (-1, -1)
if header_i != []:
try:
slice_range = next((h.colindex, header_i[i + 1].colindex)
for i, h in enumerate(header_i) if header in h.header)
except Exception:
pass
return slice_range
def parse_headers(my_sheet, input_headers_dict, headers, start_line, slice_in):
"""return a dict of header_slice
key = column index
value = header name"""
for h0 in input_headers_dict:
slice_out = read_slice(my_sheet, start_line, slice_in, h0)
iteration = 1
while slice_out == (-1, -1) and iteration < 10:
# try next lines
slice_out = read_slice(my_sheet, start_line + iteration, slice_in, h0)
iteration += 1
if slice_out == (-1, -1):
if h0 in ('east', 'Node A', 'Node Z', 'City'):
print(f'{ansi_escapes.red}CRITICAL{ansi_escapes.reset}: missing _{h0}_ header: EXECUTION ENDS')
exit()
else:
print(f'missing header {h0}')
elif not isinstance(input_headers_dict[h0], dict):
headers[slice_out[0]] = input_headers_dict[h0]
else:
headers = parse_headers(my_sheet, input_headers_dict[h0], headers, start_line + 1, slice_out)
if headers == {}:
print(f'{ansi_escapes.red}CRITICAL ERROR{ansi_escapes.reset}: could not find any header to read _ ABORT')
exit()
return headers
def parse_row(row, headers):
return {f: r.value for f, r in
zip([label for label in headers.values()], [row[i] for i in headers])}
def parse_sheet(my_sheet, input_headers_dict, header_line, start_line, column):
headers = parse_headers(my_sheet, input_headers_dict, {}, header_line, (0, column))
for row in all_rows(my_sheet, start=start_line):
yield parse_row(row[0: column], headers)
def sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city):
duplicate_links = []
for l1 in links:
for l2 in links:
if l1 is not l2 and l1 == l2 and l2 not in duplicate_links:
print(f'\nWARNING\n \
link {l1.from_city}-{l1.to_city} is duplicate \
\nthe 1st duplicate link will be removed but you should check Links sheet input')
duplicate_links.append(l1)
for l in duplicate_links:
links.remove(l)
try:
test_nodes = [n for n in nodes_by_city if n not in links_by_city]
test_links = [n for n in links_by_city if n not in nodes_by_city]
test_eqpts = [n for n in eqpts_by_city if n not in nodes_by_city]
assert (test_nodes == [] or test_nodes == [''])\
and (test_links == [] or test_links == [''])\
and (test_eqpts == [] or test_eqpts == [''])
except AssertionError:
msg = f'CRITICAL error in excel input: Names in Nodes and Links sheets do not match, check:\
\n{test_nodes} in Nodes sheet\
\n{test_links} in Links sheet\
\n{test_eqpts} in Eqpt sheet'
raise NetworkTopologyError(msg)
for city, link in links_by_city.items():
if nodes_by_city[city].node_type.lower() == 'ila' and len(link) != 2:
# wrong input: ILA sites can only be Degree 2
# => correct to make it a ROADM and remove entry in links_by_city
# TODO: put in log rather than print
print(f'invalid node type ({nodes_by_city[city].node_type})\
specified in {city}, replaced by ROADM')
nodes_by_city[city].node_type = 'ROADM'
for n in nodes:
if n.city == city:
n.node_type = 'ROADM'
return nodes, links
def xls_to_json_data(input_filename, filter_region=[]):
nodes, links, eqpts = parse_excel(input_filename)
if filter_region:
nodes = [n for n in nodes if n.region.lower() in filter_region]
cities = {n.city for n in nodes}
links = [lnk for lnk in links if lnk.from_city in cities and
lnk.to_city in cities]
cities = {lnk.from_city for lnk in links} | {lnk.to_city for lnk in links}
nodes = [n for n in nodes if n.city in cities]
global nodes_by_city
nodes_by_city = {n.city: n for n in nodes}
global links_by_city
links_by_city = defaultdict(list)
for link in links:
links_by_city[link.from_city].append(link)
links_by_city[link.to_city].append(link)
global eqpts_by_city
eqpts_by_city = defaultdict(list)
for eqpt in eqpts:
eqpts_by_city[eqpt.from_city].append(eqpt)
nodes, links = sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city)
return {
'elements':
[{'uid': f'trx {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Transceiver'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'] +
[{'uid': f'roadm {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'
and x.booster_restriction == '' and x.preamp_restriction == ''] +
[{'uid': f'roadm {x.city}',
'params': {
'restrictions': {
'preamp_variety_list': silent_remove(x.preamp_restriction.split(' | '), ''),
'booster_variety_list': silent_remove(x.booster_restriction.split(' | '), '')
}
},
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm' and
(x.booster_restriction != '' or x.preamp_restriction != '')] +
[{'uid': f'west fused spans in {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Fused'}
for x in nodes_by_city.values() if x.node_type.lower() == 'fused'] +
[{'uid': f'east fused spans in {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Fused'}
for x in nodes_by_city.values() if x.node_type.lower() == 'fused'] +
[{'uid': f'fiber ({x.from_city} \u2192 {x.to_city})-{x.east_cable}',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'type_variety': x.east_fiber,
'params': {'length': round(x.east_distance, 3),
'length_units': x.distance_units,
'loss_coef': x.east_lineic,
'con_in': x.east_con_in,
'con_out': x.east_con_out}
}
for x in links] +
[{'uid': f'fiber ({x.to_city} \u2192 {x.from_city})-{x.west_cable}',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'type_variety': x.west_fiber,
'params': {'length': round(x.west_distance, 3),
'length_units': x.distance_units,
'loss_coef': x.west_lineic,
'con_in': x.west_con_in,
'con_out': x.west_con_out}
} # missing ILA construction
for x in links] +
[{'uid': f'east edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Edfa',
'type_variety': e.east_amp_type,
'operational': {'gain_target': e.east_amp_gain,
'delta_p': e.east_amp_dp,
'tilt_target': e.east_tilt,
'out_voa': e.east_att_out}
}
for e in eqpts if (e.east_amp_type.lower() != '' and \
e.east_amp_type.lower() != 'fused')] +
[{'uid': f'west edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Edfa',
'type_variety': e.west_amp_type,
'operational': {'gain_target': e.west_amp_gain,
'delta_p': e.west_amp_dp,
'tilt_target': e.west_tilt,
'out_voa': e.west_att_out}
}
for e in eqpts if (e.west_amp_type.lower() != '' and \
e.west_amp_type.lower() != 'fused')] +
# fused edfa variety is a hack to indicate that there should not be a
# booster amplifier at the ROADM output.
# If the user specifies ILA in the Nodes sheet and fused in the Eqpt sheet, then assume that
# this is a fused node.
[{'uid': f'east edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Fused',
'params': {'loss': 0}
}
for e in eqpts if e.east_amp_type.lower() == 'fused'] +
[{'uid': f'west edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Fused',
'params': {'loss': 0}
}
for e in eqpts if e.west_amp_type.lower() == 'fused'],
'connections':
list(chain.from_iterable([eqpt_connection_by_city(n.city)
for n in nodes]))
+
list(chain.from_iterable(zip(
[{'from_node': f'trx {x.city}',
'to_node': f'roadm {x.city}'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'],
[{'from_node': f'roadm {x.city}',
'to_node': f'trx {x.city}'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'])))
}
def convert_file(input_filename, filter_region=[], output_json_file_name=None):
data = xls_to_json_data(input_filename, filter_region)
if output_json_file_name is None:
output_json_file_name = input_filename.with_suffix('.json')
with open(output_json_file_name, 'w', encoding='utf-8') as edfa_json_file:
edfa_json_file.write(dumps(data, indent=2, ensure_ascii=False))
return output_json_file_name
def corresp_names(input_filename, network):
""" a function that builds the correspondance between names given in the excel,
and names used in the json, and created by the autodesign.
All names are listed
"""
nodes, links, eqpts = parse_excel(input_filename)
fused = [n.uid for n in network.nodes() if isinstance(n, Fused)]
ila = [n.uid for n in network.nodes() if isinstance(n, Edfa)]
corresp_roadm = {x.city: [f'roadm {x.city}'] for x in nodes
if x.node_type.lower() == 'roadm'}
corresp_fused = {x.city: [f'west fused spans in {x.city}', f'east fused spans in {x.city}']
for x in nodes if x.node_type.lower() == 'fused' and
f'west fused spans in {x.city}' in fused and
f'east fused spans in {x.city}' in fused}
# add the special cases when an ila is changed into a fused
for my_e in eqpts:
name = f'east edfa in {my_e.from_city} to {my_e.to_city}'
if my_e.east_amp_type.lower() == 'fused' and name in fused:
if my_e.from_city in corresp_fused.keys():
corresp_fused[my_e.from_city].append(name)
else:
corresp_fused[my_e.from_city] = [name]
name = f'west edfa in {my_e.from_city} to {my_e.to_city}'
if my_e.west_amp_type.lower() == 'fused' and name in fused:
if my_e.from_city in corresp_fused.keys():
corresp_fused[my_e.from_city].append(name)
else:
corresp_fused[my_e.from_city] = [name]
# build corresp ila based on eqpt sheet
# start with east direction
corresp_ila = {e.from_city: [f'east edfa in {e.from_city} to {e.to_city}']
for e in eqpts if e.east_amp_type.lower() != '' and
f'east edfa in {e.from_city} to {e.to_city}' in ila}
# west direction, append name or create a new item in dict
for my_e in eqpts:
if my_e.west_amp_type.lower() != '':
name = f'west edfa in {my_e.from_city} to {my_e.to_city}'
if name in ila:
if my_e.from_city in corresp_ila.keys():
corresp_ila[my_e.from_city].append(name)
else:
corresp_ila[my_e.from_city] = [name]
# complete with potential autodesign names: amplifiers
for my_l in links:
name = f'Edfa0_fiber ({my_l.to_city} \u2192 {my_l.from_city})-{my_l.west_cable}'
if name in ila:
if my_l.from_city in corresp_ila.keys():
# "east edfa in Stbrieuc to Rennes_STA" is equivalent name as
# "Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056"
# "west edfa in Stbrieuc to Rennes_STA" is equivalent name as
# "Edfa0_fiber (Rennes_STA → Stbrieuc)-F057"
# does not filter names: all types (except boosters) are created.
# in case fibers are splitted the name here is a prefix
corresp_ila[my_l.from_city].append(name)
else:
corresp_ila[my_l.from_city] = [name]
name = f'Edfa0_fiber ({my_l.from_city} \u2192 {my_l.to_city})-{my_l.east_cable}'
if name in ila:
if my_l.to_city in corresp_ila.keys():
corresp_ila[my_l.to_city].append(name)
else:
corresp_ila[my_l.to_city] = [name]
# merge fused with ila:
for key, val in corresp_fused.items():
if key in corresp_ila.keys():
corresp_ila[key].extend(val)
else:
corresp_ila[key] = val
# no need of roadm booster
return corresp_roadm, corresp_fused, corresp_ila
def parse_excel(input_filename):
link_headers = {
'Node A': 'from_city',
'Node Z': 'to_city',
'east': {
'Distance (km)': 'east_distance',
'Fiber type': 'east_fiber',
'lineic att': 'east_lineic',
'Con_in': 'east_con_in',
'Con_out': 'east_con_out',
'PMD': 'east_pmd',
'Cable id': 'east_cable'
},
'west': {
'Distance (km)': 'west_distance',
'Fiber type': 'west_fiber',
'lineic att': 'west_lineic',
'Con_in': 'west_con_in',
'Con_out': 'west_con_out',
'PMD': 'west_pmd',
'Cable id': 'west_cable'
}
}
node_headers = {
'City': 'city',
'State': 'state',
'Country': 'country',
'Region': 'region',
'Latitude': 'latitude',
'Longitude': 'longitude',
'Type': 'node_type',
'Booster_restriction': 'booster_restriction',
'Preamp_restriction': 'preamp_restriction'
}
eqpt_headers = {
'Node A': 'from_city',
'Node Z': 'to_city',
'east': {
'amp type': 'east_amp_type',
'att_in': 'east_att_in',
'amp gain': 'east_amp_gain',
'delta p': 'east_amp_dp',
'tilt': 'east_tilt',
'att_out': 'east_att_out'
},
'west': {
'amp type': 'west_amp_type',
'att_in': 'west_att_in',
'amp gain': 'west_amp_gain',
'delta p': 'west_amp_dp',
'tilt': 'west_tilt',
'att_out': 'west_att_out'
}
}
with open_workbook(input_filename) as wb:
nodes_sheet = wb.sheet_by_name('Nodes')
links_sheet = wb.sheet_by_name('Links')
try:
eqpt_sheet = wb.sheet_by_name('Eqpt')
except Exception:
# eqpt_sheet is optional
eqpt_sheet = None
nodes = []
for node in parse_sheet(nodes_sheet, node_headers, NODES_LINE, NODES_LINE + 1, NODES_COLUMN):
nodes.append(Node(**node))
expected_node_types = {'ROADM', 'ILA', 'FUSED'}
for n in nodes:
if n.node_type not in expected_node_types:
n.node_type = 'ILA'
links = []
for link in parse_sheet(links_sheet, link_headers, LINKS_LINE, LINKS_LINE + 2, LINKS_COLUMN):
links.append(Link(**link))
eqpts = []
if eqpt_sheet is not None:
for eqpt in parse_sheet(eqpt_sheet, eqpt_headers, EQPTS_LINE, EQPTS_LINE + 2, EQPTS_COLUMN):
eqpts.append(Eqpt(**eqpt))
# sanity check
all_cities = Counter(n.city for n in nodes)
if len(all_cities) != len(nodes):
raise ValueError(f'Duplicate city: {all_cities}')
bad_links = []
for lnk in links:
if lnk.from_city not in all_cities or lnk.to_city not in all_cities:
bad_links.append([lnk.from_city, lnk.to_city])
if bad_links:
raise NetworkTopologyError(f'Bad link(s): {bad_links}.')
return nodes, links, eqpts
def eqpt_connection_by_city(city_name):
other_cities = fiber_dest_from_source(city_name)
subdata = []
if nodes_by_city[city_name].node_type.lower() in {'ila', 'fused'}:
# Then len(other_cities) == 2
direction = ['west', 'east']
for i in range(2):
from_ = fiber_link(other_cities[i], city_name)
in_ = eqpt_in_city_to_city(city_name, other_cities[0], direction[i])
to_ = fiber_link(city_name, other_cities[1 - i])
subdata += connect_eqpt(from_, in_, to_)
elif nodes_by_city[city_name].node_type.lower() == 'roadm':
for other_city in other_cities:
from_ = f'roadm {city_name}'
in_ = eqpt_in_city_to_city(city_name, other_city)
to_ = fiber_link(city_name, other_city)
subdata += connect_eqpt(from_, in_, to_)
from_ = fiber_link(other_city, city_name)
in_ = eqpt_in_city_to_city(city_name, other_city, "west")
to_ = f'roadm {city_name}'
subdata += connect_eqpt(from_, in_, to_)
return subdata
def connect_eqpt(from_, in_, to_):
connections = []
if in_ != '':
connections = [{'from_node': from_, 'to_node': in_},
{'from_node': in_, 'to_node': to_}]
else:
connections = [{'from_node': from_, 'to_node': to_}]
return connections
def eqpt_in_city_to_city(in_city, to_city, direction='east'):
rev_direction = 'west' if direction == 'east' else 'east'
amp_direction = f'{direction}_amp_type'
amp_rev_direction = f'{rev_direction}_amp_type'
return_eqpt = ''
if in_city in eqpts_by_city:
for e in eqpts_by_city[in_city]:
if nodes_by_city[in_city].node_type.lower() == 'roadm':
if e.to_city == to_city and getattr(e, amp_direction) != '':
return_eqpt = f'{direction} edfa in {e.from_city} to {e.to_city}'
elif nodes_by_city[in_city].node_type.lower() == 'ila':
if e.to_city != to_city:
direction = rev_direction
amp_direction = amp_rev_direction
if getattr(e, amp_direction) != '':
return_eqpt = f'{direction} edfa in {e.from_city} to {e.to_city}'
if nodes_by_city[in_city].node_type.lower() == 'fused':
return_eqpt = f'{direction} fused spans in {in_city}'
return return_eqpt
def corresp_next_node(network, corresp_ila, corresp_roadm):
""" for each name in corresp dictionnaries find the next node in network and its name
given by user in excel. for meshTopology_exampleV2.xls:
user ILA name Stbrieuc covers the two direction. convert.py creates 2 different ILA
with possible names (depending on the direction and if the eqpt was defined in eqpt
sheet)
- east edfa in Stbrieuc to Rennes_STA
- west edfa in Stbrieuc to Rennes_STA
- Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056
- Edfa0_fiber (Rennes_STA → Stbrieuc)-F057
next_nodes finds the user defined name of next node to be able to map the path constraints
- east edfa in Stbrieuc to Rennes_STA next node = Rennes_STA
- west edfa in Stbrieuc to Rennes_STA next node Lannion_CAS
Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056 and Edfa0_fiber (Rennes_STA → Stbrieuc)-F057
do not exist
the function supports fiber splitting, fused nodes and shall only be called if
excel format is used for both network and service
"""
next_node = {}
# consolidate tables and create next_node table
for ila_key, ila_list in corresp_ila.items():
temp = copy(ila_list)
for ila_elem in ila_list:
# find the node with ila_elem string _in_ the node uid. 'in' is used instead of
# '==' to find composed nodes due to fiber splitting in autodesign.
# e.g. if ila_elem is 'Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056',
# node uid 'Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056_(1/2)' is possible
correct_ila_name = next(n.uid for n in network.nodes() if ila_elem in n.uid)
temp.remove(ila_elem)
temp.append(correct_ila_name)
ila_nd = next(n for n in network.nodes() if ila_elem in n.uid)
next_nd = next(network.successors(ila_nd))
# search for the next ILA or ROADM
while isinstance(next_nd, (Fiber, Fused)):
next_nd = next(network.successors(next_nd))
# if next_nd is a ROADM, add the first found correspondance
for key, val in corresp_roadm.items():
# val is a list of possible names associated with key
if next_nd.uid in val:
next_node[correct_ila_name] = key
break
# if next_nd was not already added in the dict with the previous loop,
# add the first found correspondance in ila names
if correct_ila_name not in next_node.keys():
for key, val in corresp_ila.items():
# in case of split fibers the ila name might not be an exact match
if [e for e in val if e in next_nd.uid]:
next_node[correct_ila_name] = key
break
corresp_ila[ila_key] = temp
return corresp_ila, next_node
def fiber_dest_from_source(city_name):
destinations = []
links_from_city = links_by_city[city_name]
for l in links_from_city:
if l.from_city == city_name:
destinations.append(l.to_city)
else:
destinations.append(l.from_city)
return destinations
def fiber_link(from_city, to_city):
source_dest = (from_city, to_city)
links = links_by_city[from_city]
link = next(l for l in links if l.from_city in source_dest and l.to_city in source_dest)
if link.from_city == from_city:
fiber = f'fiber ({link.from_city} \u2192 {link.to_city})-{link.east_cable}'
else:
fiber = f'fiber ({link.to_city} \u2192 {link.from_city})-{link.west_cable}'
return fiber
def midpoint(city_a, city_b):
lats = city_a.latitude, city_b.latitude
longs = city_a.longitude, city_b.longitude
try:
result = {
'latitude': sum(lats) / 2,
'longitude': sum(longs) / 2
}
except TypeError:
result = {
'latitude': 0,
'longitude': 0
}
return result
# TODO get column size automatically from tuple size
NODES_COLUMN = 10
NODES_LINE = 4
LINKS_COLUMN = 16
LINKS_LINE = 3
EQPTS_LINE = 3
EQPTS_COLUMN = 14
def _do_convert():
parser = ArgumentParser()
parser.add_argument('workbook', type=Path)
parser.add_argument('-f', '--filter-region', action='append', default=[])
parser.add_argument('--output', type=Path, help='Name of the generated JSON file')
args = parser.parse_args()
res = convert_file(args.workbook, args.filter_region, args.output)
print(f'XLS -> JSON saved to {res}')
if __name__ == '__main__':
_do_convert()
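The conversion can also be called from Python. A minimal sketch, assuming a hypothetical workbook my_topology.xls laid out with the Nodes and Links sheets (and an optional Eqpt sheet) described in the module docstring above:

from pathlib import Path
from gnpy.tools.convert import convert_file, xls_to_json_data

# hypothetical workbook name, for illustration only
workbook = Path('my_topology.xls')
json_file = convert_file(workbook)   # writes my_topology.json next to the workbook
data = xls_to_json_data(workbook)    # or build the dict directly, without writing a file
print(f'{len(data["elements"])} elements, {len(data["connections"])} connections')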

536
gnpy/tools/json_io.py Normal file

@@ -0,0 +1,536 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.tools.json_io
==================
Loading and saving data from JSON files in GNPy's internal data format
'''
from networkx import DiGraph
from logging import getLogger
from pathlib import Path
import json
from collections import namedtuple
from gnpy.core import ansi_escapes, elements
from gnpy.core.equipment import trx_mode_params
from gnpy.core.exceptions import ConfigurationError, EquipmentConfigError, NetworkTopologyError, ServiceError
from gnpy.core.science_utils import estimate_nf_model
from gnpy.core.utils import automatic_nch, automatic_fmax, merge_amplifier_restrictions
from gnpy.topology.request import PathRequest, Disjunction
from gnpy.tools.convert import xls_to_json_data
from gnpy.tools.service_sheet import read_service_sheet
import time
_logger = getLogger(__name__)
Model_vg = namedtuple('Model_vg', 'nf1 nf2 delta_p')
Model_fg = namedtuple('Model_fg', 'nf0')
Model_openroadm = namedtuple('Model_openroadm', 'nf_coef')
Model_hybrid = namedtuple('Model_hybrid', 'nf_ram gain_ram edfa_variety')
Model_dual_stage = namedtuple('Model_dual_stage', 'preamp_variety booster_variety')
class _JsonThing:
def update_attr(self, default_values, kwargs, name):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in default_values.items():
setattr(self, k, clean_kwargs.get(k, v))
if k not in clean_kwargs and name != 'Amp':
print(ansi_escapes.red +
f'\n WARNING missing {k} attribute in eqpt_config.json[{name}]' +
f'\n default value is {k} = {v}' +
ansi_escapes.reset)
time.sleep(1)
class SI(_JsonThing):
default_values = {
"f_min": 191.35e12,
"f_max": 196.1e12,
"baud_rate": 32e9,
"spacing": 50e9,
"power_dbm": 0,
"power_range_db": [0, 0, 0.5],
"roll_off": 0.15,
"tx_osnr": 45,
"sys_margins": 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'SI')
class Span(_JsonThing):
default_values = {
'power_mode': True,
'delta_power_range_db': None,
'max_fiber_lineic_loss_for_raman': 0.25,
'target_extended_gain': 2.5,
'max_length': 150,
'length_units': 'km',
'max_loss': None,
'padding': 10,
'EOL': 0,
'con_in': 0,
'con_out': 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Span')
class Roadm(_JsonThing):
default_values = {
'target_pch_out_db': -17,
'add_drop_osnr': 100,
'pmd': 0,
'restrictions': {
'preamp_variety_list': [],
'booster_variety_list': []
}
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Roadm')
class Transceiver(_JsonThing):
default_values = {
'type_variety': None,
'frequency': None,
'mode': {}
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Transceiver')
class Fiber(_JsonThing):
default_values = {
'type_variety': '',
'dispersion': None,
'gamma': 0,
'pmd_coef': 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Fiber')
class RamanFiber(_JsonThing):
default_values = {
'type_variety': '',
'dispersion': None,
'gamma': 0,
'pmd_coef': 0,
'raman_efficiency': None
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'RamanFiber')
for param in ('cr', 'frequency_offset'):
if param not in self.raman_efficiency:
raise EquipmentConfigError(f'RamanFiber.raman_efficiency: missing "{param}" parameter')
if self.raman_efficiency['frequency_offset'] != sorted(self.raman_efficiency['frequency_offset']):
raise EquipmentConfigError(f'RamanFiber.raman_efficiency.frequency_offset is not sorted')
class Amp(_JsonThing):
default_values = {
'f_min': 191.35e12,
'f_max': 196.1e12,
'type_variety': '',
'type_def': '',
'gain_flatmax': None,
'gain_min': None,
'p_max': None,
'nf_model': None,
'dual_stage_model': None,
'nf_fit_coeff': None,
'nf_ripple': None,
'dgt': None,
'gain_ripple': None,
'out_voa_auto': False,
'allowed_for_design': False,
'raman': False
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Amp')
@classmethod
def from_json(cls, filename, **kwargs):
config = Path(filename).parent / 'default_edfa_config.json'
type_variety = kwargs['type_variety']
type_def = kwargs.get('type_def', 'variable_gain') # default compatibility with older json eqpt files
nf_def = None
dual_stage_def = None
if type_def == 'fixed_gain':
try:
nf0 = kwargs.pop('nf0')
except KeyError: # nf0 is expected for a fixed gain amp
raise EquipmentConfigError(f'missing nf0 value input for amplifier: {type_variety} in equipment config')
for k in ('nf_min', 'nf_max'):
try:
del kwargs[k]
except KeyError:
pass
nf_def = Model_fg(nf0)
elif type_def == 'advanced_model':
config = Path(filename).parent / kwargs.pop('advanced_config_from_json')
elif type_def == 'variable_gain':
gain_min, gain_max = kwargs['gain_min'], kwargs['gain_flatmax']
try: # nf_min and nf_max are expected for a variable gain amp
nf_min = kwargs.pop('nf_min')
nf_max = kwargs.pop('nf_max')
except KeyError:
raise EquipmentConfigError(f'missing nf_min or nf_max value input for amplifier: {type_variety} in equipment config')
try: # remove all remaining nf inputs
del kwargs['nf0']
except KeyError:
pass # nf0 is not needed for variable gain amp
nf1, nf2, delta_p = estimate_nf_model(type_variety, gain_min, gain_max, nf_min, nf_max)
nf_def = Model_vg(nf1, nf2, delta_p)
elif type_def == 'openroadm':
try:
nf_coef = kwargs.pop('nf_coef')
except KeyError: # nf_coef is expected for openroadm amp
raise EquipmentConfigError(f'missing nf_coef input for amplifier: {type_variety} in equipment config')
nf_def = Model_openroadm(nf_coef)
elif type_def == 'dual_stage':
try: # nf_ram and gain_ram are expected for a hybrid amp
preamp_variety = kwargs.pop('preamp_variety')
booster_variety = kwargs.pop('booster_variety')
except KeyError:
raise EquipmentConfigError(f'missing preamp/booster variety input for amplifier: {type_variety} in equipment config')
dual_stage_def = Model_dual_stage(preamp_variety, booster_variety)
json_data = load_json(config)
return cls(**{**kwargs, **json_data,
'nf_model': nf_def, 'dual_stage_model': dual_stage_def})
def _automatic_spacing(baud_rate):
"""return the min possible channel spacing for a given baud rate"""
# TODO: this should be parametrized in a cfg file
# list of possible tuples [(max_baud_rate, spacing_for_this_baud_rate)]
spacing_list = [(33e9, 37.5e9), (38e9, 50e9), (50e9, 62.5e9), (67e9, 75e9), (92e9, 100e9)]
return min((s[1] for s in spacing_list if s[0] > baud_rate), default=baud_rate * 1.2)
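# Illustrative values (computed from spacing_list above, not normative):
#   _automatic_spacing(32e9) -> 37.5e9
#   _automatic_spacing(64e9) -> 75e9
#   _automatic_spacing(100e9) -> 120e9 (no max_baud_rate above 100 GBd, so 1.2 * baud_rate)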
def load_equipment(filename):
json_data = load_json(filename)
return _equipment_from_json(json_data, filename)
def _update_trx_osnr(equipment):
"""add sys_margins to all Transceivers OSNR values"""
for trx in equipment['Transceiver'].values():
for m in trx.mode:
m['OSNR'] = m['OSNR'] + equipment['SI']['default'].sys_margins
return equipment
def _update_dual_stage(equipment):
edfa_dict = equipment['Edfa']
for edfa in edfa_dict.values():
if edfa.type_def == 'dual_stage':
edfa_preamp = edfa_dict[edfa.dual_stage_model.preamp_variety]
edfa_booster = edfa_dict[edfa.dual_stage_model.booster_variety]
for key, value in edfa_preamp.__dict__.items():
attr_k = 'preamp_' + key
setattr(edfa, attr_k, value)
for key, value in edfa_booster.__dict__.items():
attr_k = 'booster_' + key
setattr(edfa, attr_k, value)
edfa.p_max = edfa_booster.p_max
edfa.gain_flatmax = edfa_booster.gain_flatmax + edfa_preamp.gain_flatmax
if edfa.gain_min < edfa_preamp.gain_min:
raise EquipmentConfigError(f'Dual stage {edfa.type_variety} minimal gain is lower than its preamp minimal gain')
return equipment
def _roadm_restrictions_sanity_check(equipment):
""" verifies that booster and preamp restrictions specified in roadm equipment are listed
in the edfa.
"""
restrictions = equipment['Roadm']['default'].restrictions['booster_variety_list'] + \
equipment['Roadm']['default'].restrictions['preamp_variety_list']
for amp_name in restrictions:
if amp_name not in equipment['Edfa']:
raise EquipmentConfigError(f'ROADM restriction {amp_name} does not refer to a defined EDFA name')
def _equipment_from_json(json_data, filename):
"""build global dictionnary eqpt_library that stores all eqpt characteristics:
edfa type type_variety, fiber type_variety
from the eqpt_config.json (filename parameter)
also read advanced_config_from_json file parameters for edfa if they are available:
typically nf_ripple, dfg gain ripple, dgt and nf polynomial nf_fit_coeff
if advanced_config_from_json file parameter is not present: use nf_model:
requires nf_min and nf_max values boundaries of the edfa gain range
"""
equipment = {}
for key, entries in json_data.items():
equipment[key] = {}
for entry in entries:
subkey = entry.get('type_variety', 'default')
if key == 'Edfa':
equipment[key][subkey] = Amp.from_json(filename, **entry)
elif key == 'Fiber':
equipment[key][subkey] = Fiber(**entry)
elif key == 'Span':
equipment[key][subkey] = Span(**entry)
elif key == 'Roadm':
equipment[key][subkey] = Roadm(**entry)
elif key == 'SI':
equipment[key][subkey] = SI(**entry)
elif key == 'Transceiver':
equipment[key][subkey] = Transceiver(**entry)
elif key == 'RamanFiber':
equipment[key][subkey] = RamanFiber(**entry)
else:
raise EquipmentConfigError(f'Unrecognized network element type "{key}"')
equipment = _update_trx_osnr(equipment)
equipment = _update_dual_stage(equipment)
_roadm_restrictions_sanity_check(equipment)
return equipment
def load_network(filename, equipment):
if filename.suffix.lower() in ('.xls', '.xlsx'):
json_data = xls_to_json_data(filename)
elif filename.suffix.lower() == '.json':
json_data = load_json(filename)
else:
raise ValueError(f'unsupported topology filename extension {filename.suffix.lower()}')
return network_from_json(json_data, equipment)
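# A minimal loading sketch (hypothetical file names, for illustration only):
#   equipment = load_equipment(Path('eqpt_config.json'))
#   network = load_network(Path('my_topology.xls'), equipment)  # .json topologies work too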
def save_network(network: DiGraph, filename: str):
'''Dump the network into a JSON file
:param network: network to work on
:param filename: file to write to
'''
save_json(network_to_json(network), filename)
def _cls_for(equipment_type):
if equipment_type == 'Edfa':
return elements.Edfa
if equipment_type == 'Fused':
return elements.Fused
elif equipment_type == 'Roadm':
return elements.Roadm
elif equipment_type == 'Transceiver':
return elements.Transceiver
elif equipment_type == 'Fiber':
return elements.Fiber
elif equipment_type == 'RamanFiber':
return elements.RamanFiber
else:
raise ConfigurationError(f'Unknown network equipment "{equipment_type}"')
def network_from_json(json_data, equipment):
# NOTE|dutc: we could use the following, but it would tie our data format
# too closely to the graph library
# from networkx import node_link_graph
g = DiGraph()
for el_config in json_data['elements']:
typ = el_config.pop('type')
variety = el_config.pop('type_variety', 'default')
cls = _cls_for(typ)
if typ == 'Fused':
# well, there's no variety for the 'Fused' node type
pass
elif variety in equipment[typ]:
extra_params = equipment[typ][variety]
temp = el_config.setdefault('params', {})
temp = merge_amplifier_restrictions(temp, extra_params.__dict__)
el_config['params'] = temp
el_config['type_variety'] = variety
elif typ in ['Edfa', 'Fiber', 'RamanFiber']: # catch it now because the code will crash later!
raise ConfigurationError(f'The {typ} of variety type {variety} was not recognized:'
'\nplease check it is properly defined in the eqpt_config json file')
el = cls(**el_config)
g.add_node(el)
nodes = {k.uid: k for k in g.nodes()}
for cx in json_data['connections']:
from_node, to_node = cx['from_node'], cx['to_node']
try:
if isinstance(nodes[from_node], elements.Fiber):
edge_length = nodes[from_node].params.length
else:
edge_length = 0.01
g.add_edge(nodes[from_node], nodes[to_node], weight=edge_length)
except KeyError:
raise NetworkTopologyError(f'can not find {from_node} or {to_node} defined in {cx}')
return g
def network_to_json(network):
data = {
'elements': [n.to_json for n in network]
}
connections = {
'connections': [{"from_node": n.uid,
"to_node": next_n.uid}
for n in network
for next_n in network.successors(n) if next_n is not None]
}
data.update(connections)
return data
def load_json(filename):
with open(filename, 'r', encoding='utf-8') as f:
data = json.load(f)
return data
def save_json(obj, filename):
with open(filename, 'w', encoding='utf-8') as f:
json.dump(obj, f, indent=2, ensure_ascii=False)
def load_requests(filename, eqpt, bidir, network, network_filename):
""" loads the requests from a json or an excel file into a data string
"""
if filename.suffix.lower() in ('.xls', '.xlsx'):
_logger.info('Automatically converting requests from XLS to JSON')
try:
return convert_service_sheet(filename, eqpt, network, network_filename=network_filename, bidir=bidir)
except ServiceError as this_e:
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {this_e}')
exit(1)
else:
return load_json(filename)
def requests_from_json(json_data, equipment):
"""Extract list of requests from data parsed from JSON"""
requests_list = []
for req in json_data['path-request']:
# init all params from request
params = {}
params['request_id'] = req['request-id']
params['source'] = req['source']
params['bidir'] = req['bidirectional']
params['destination'] = req['destination']
params['trx_type'] = req['path-constraints']['te-bandwidth']['trx_type']
params['trx_mode'] = req['path-constraints']['te-bandwidth']['trx_mode']
params['format'] = params['trx_mode']
params['spacing'] = req['path-constraints']['te-bandwidth']['spacing']
try:
nd_list = req['explicit-route-objects']['route-object-include-exclude']
except KeyError:
nd_list = []
params['nodes_list'] = [n['num-unnum-hop']['node-id'] for n in nd_list]
params['loose_list'] = [n['num-unnum-hop']['hop-type'] for n in nd_list]
# recover trx physical param (baudrate, ...) from type and mode
# in trx_mode_params optical power is read from equipment['SI']['default'] and
# nb_channel is computed based on min max frequency and spacing
trx_params = trx_mode_params(equipment, params['trx_type'], params['trx_mode'], True)
params.update(trx_params)
# print(trx_params['min_spacing'])
# optical power might be set differently in the request. if it is indicated then the
# params['power'] is updated
try:
if req['path-constraints']['te-bandwidth']['output-power']:
params['power'] = req['path-constraints']['te-bandwidth']['output-power']
except KeyError:
pass
# same process for nb-channel
f_min = params['f_min']
f_max_from_si = params['f_max']
try:
if req['path-constraints']['te-bandwidth']['max-nb-of-channel'] is not None:
nch = req['path-constraints']['te-bandwidth']['max-nb-of-channel']
params['nb_channel'] = nch
spacing = params['spacing']
params['f_max'] = automatic_fmax(f_min, spacing, nch)
else:
params['nb_channel'] = automatic_nch(f_min, f_max_from_si, params['spacing'])
except KeyError:
params['nb_channel'] = automatic_nch(f_min, f_max_from_si, params['spacing'])
_check_one_request(params, f_max_from_si)
try:
params['path_bandwidth'] = req['path-constraints']['te-bandwidth']['path_bandwidth']
except KeyError:
pass
requests_list.append(PathRequest(**params))
return requests_list
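A minimal sketch of the JSON fields that this parser reads from each entry of 'path-request'; node names, trx_type and trx_mode are placeholders and must match the equipment library for trx_mode_params() to succeed:

# Hypothetical request payload; only keys actually read above are shown.
request_sketch = {
    'path-request': [{
        'request-id': '0',
        'source': 'trx SiteA',
        'destination': 'trx SiteB',
        'bidirectional': False,
        'path-constraints': {
            'te-bandwidth': {
                'trx_type': 'vendorA_trx',
                'trx_mode': 'mode 1',
                'spacing': 50e9,   # Hz
                # optional keys: 'output-power', 'max-nb-of-channel', 'path_bandwidth'
            }
        }
        # optional: 'explicit-route-objects' with 'route-object-include-exclude' entries
    }]
}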
def _check_one_request(params, f_max_from_si):
"""Checks that the requested parameters are consistant (spacing vs nb channel vs transponder mode...)"""
f_min = params['f_min']
f_max = params['f_max']
max_recommanded_nb_channels = automatic_nch(f_min, f_max, params['spacing'])
if params['baud_rate'] is not None:
# implicitly means that a mode is defined with min_spacing
if params['min_spacing'] > params['spacing']:
msg = f'Request {params["request_id"]} has spacing below transponder ' +\
f'{params["trx_type"]} {params["trx_mode"]} min spacing value ' +\
f'{params["min_spacing"]*1e-9}GHz.\nComputation stopped'
print(msg)
_logger.critical(msg)
raise ServiceError(msg)
if f_max > f_max_from_si:
msg = f'''Requested channel number {params["nb_channel"]}, baud rate {params["baud_rate"]} GHz
and requested spacing {params["spacing"]*1e-9}GHz is not consistent with frequency range
{f_min*1e-12} THz, {f_max*1e-12} THz, min recommended spacing {params["min_spacing"]*1e-9}GHz.
max recommended nb of channels is {max_recommanded_nb_channels}.'''
_logger.critical(msg)
raise ServiceError(msg)
def disjunctions_from_json(json_data):
""" reads the disjunction requests from the json dict and create the list
of requested disjunctions for this set of requests
"""
disjunctions_list = []
try:
temp_test = json_data['synchronization']
except KeyError:
temp_test = []
if temp_test:
for snc in json_data['synchronization']:
params = {}
params['disjunction_id'] = snc['synchronization-id']
params['relaxable'] = snc['svec']['relaxable']
params['link_diverse'] = 'link' in snc['svec']['disjointness']
params['node_diverse'] = 'node' in snc['svec']['disjointness']
params['disjunctions_req'] = snc['svec']['request-id-number']
disjunctions_list.append(Disjunction(**params))
return disjunctions_list
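For reference, a minimal 'synchronization' entry matching the keys read above; the ids are placeholders:

# Hypothetical disjunction entry; only the keys parsed above are shown.
synchronization_sketch = {
    'synchronization': [{
        'synchronization-id': '3',
        'svec': {
            'relaxable': 'false',
            'disjointness': 'node link',        # parsed with 'in', so 'node', 'link' or both
            'request-id-number': ['3', '1'],    # ids of the requests that must be disjoint
        }
    }]
}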
def convert_service_sheet(
input_filename,
eqpt,
network,
network_filename=None,
output_filename='',
bidir=False,
filter_region=None):
if output_filename == '':
output_filename = f'{str(input_filename)[0:len(str(input_filename))-len(str(input_filename.suffixes[0]))]}_services.json'
data = read_service_sheet(input_filename, eqpt, network, network_filename, bidir, filter_region)
save_json(data, output_filename)
return data

gnpy/tools/plots.py  (Executable file, 82 lines)
@@ -0,0 +1,82 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.tools.plots
================
Graphs and plots usable from a CLI application
'''
from matplotlib.pyplot import show, axis, figure, title, text
from networkx import draw_networkx_nodes, draw_networkx_edges, draw_networkx_labels
from gnpy.core.elements import Transceiver
def plot_baseline(network):
edges = set(network.edges())
pos = {n: (n.lng, n.lat) for n in network.nodes()}
labels = {n: n.location.city for n in network.nodes() if isinstance(n, Transceiver)}
city_labels = set(labels.values())
for n in network.nodes():
if n.location.city and n.location.city not in city_labels:
labels[n] = n.location.city
city_labels.add(n.location.city)
label_pos = pos
fig = figure()
kwargs = {'figure': fig, 'pos': pos}
plot = draw_networkx_nodes(network, nodelist=network.nodes(), node_color='#ababab', **kwargs)
draw_networkx_edges(network, edgelist=edges, edge_color='#ababab', **kwargs)
draw_networkx_labels(network, labels=labels, font_size=14, **{**kwargs, 'pos': label_pos})
axis('off')
show()
def plot_results(network, path, source, destination, infos):
path_edges = set(zip(path[:-1], path[1:]))
edges = set(network.edges()) - path_edges
pos = {n: (n.lng, n.lat) for n in network.nodes()}
nodes = {}
for k, (x, y) in pos.items():
nodes.setdefault((round(x, 1), round(y, 1)), []).append(k)
labels = {n: n.location.city for n in network.nodes() if isinstance(n, Transceiver)}
city_labels = set(labels.values())
for n in network.nodes():
if n.location.city and n.location.city not in city_labels:
labels[n] = n.location.city
city_labels.add(n.location.city)
label_pos = pos
fig = figure()
kwargs = {'figure': fig, 'pos': pos}
all_nodes = [n for n in network.nodes() if n not in path]
plot = draw_networkx_nodes(network, nodelist=all_nodes, node_color='#ababab', node_size=50, **kwargs)
draw_networkx_nodes(network, nodelist=path, node_color='#ff0000', node_size=55, **kwargs)
draw_networkx_edges(network, edgelist=edges, edge_color='#ababab', **kwargs)
draw_networkx_edges(network, edgelist=path_edges, edge_color='#ff0000', **kwargs)
draw_networkx_labels(network, labels=labels, font_size=14, **{**kwargs, 'pos': label_pos})
title(f'Propagating from {source.loc.city} to {destination.loc.city}')
axis('off')
heading = 'Spectral Information\n\n'
textbox = text(0.85, 0.20, heading, fontsize=14, fontname='Ubuntu Mono',
verticalalignment='top', transform=fig.axes[0].transAxes,
bbox={'boxstyle': 'round', 'facecolor': 'wheat', 'alpha': 0.5})
msgs = {(x, y): heading + '\n\n'.join(str(n) for n in ns if n in path)
for (x, y), ns in nodes.items()}
def hover(event):
if event.xdata is None or event.ydata is None:
return
if fig.contains(event):
x, y = round(event.xdata, 1), round(event.ydata, 1)
if (x, y) in msgs:
textbox.set_text(msgs[x, y])
else:
textbox.set_text(heading)
fig.canvas.draw_idle()
fig.canvas.mpl_connect('motion_notify_event', hover)
show()

gnpy/tools/service_sheet.py  (Normal file, 381 lines)
@@ -0,0 +1,381 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.tools.service_sheet
========================
XLS parser that can be called to create a JSON request file in accordance with
the YANG model for requesting path computation.
See: draft-ietf-teas-yang-path-computation-01.txt
"""
from xlrd import open_workbook, XL_CELL_EMPTY
from collections import namedtuple
from logging import getLogger
from copy import deepcopy
from gnpy.core.utils import db2lin
from gnpy.core.exceptions import ServiceError
from gnpy.core.elements import Transceiver, Roadm, Edfa, Fiber
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.tools.convert import corresp_names, corresp_next_node
SERVICES_COLUMN = 12
def all_rows(sheet, start=0):
return (sheet.row(x) for x in range(start, sheet.nrows))
logger = getLogger(__name__)
class Request(namedtuple('Request', 'request_id source destination trx_type mode \
spacing power nb_channel disjoint_from nodes_list is_loose path_bandwidth')):
def __new__(cls, request_id, source, destination, trx_type, mode=None, spacing=None, power=None, nb_channel=None, disjoint_from='', nodes_list=None, is_loose='', path_bandwidth=None):
return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing, power, nb_channel, disjoint_from, nodes_list, is_loose, path_bandwidth)
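A rough sketch of how the defaults above behave, assuming the module is importable as gnpy.tools.service_sheet and using placeholder site and transceiver names:

from gnpy.tools.service_sheet import Request

# only the mandatory columns are given; the remaining fields fall back to the
# defaults declared in __new__ (None or empty strings)
r = Request(request_id='1', source='SiteA', destination='SiteB', trx_type='vendorA_trx')
print(r.mode, r.spacing, r.disjoint_from)   # -> mode and spacing are None, disjoint_from is an empty string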
class Element:
def __eq__(self, other):
return type(self) == type(other) and self.uid == other.uid
def __hash__(self):
return hash((type(self), self.uid))
class Request_element(Element):
def __init__(self, Request, equipment, bidir):
# request_id is str
# excel has automatic number formatting that adds .0 on integer values
# the next lines recover the pure int value, assuming this .0 is unwanted
self.request_id = correct_xlrd_int_to_str_reading(Request.request_id)
self.source = f'trx {Request.source}'
self.destination = f'trx {Request.destination}'
# TODO: the automatic naming generated by excel parser requires that source and dest name
# be a string starting with 'trx' : this is manually added here.
self.srctpid = f'trx {Request.source}'
self.dsttpid = f'trx {Request.destination}'
self.bidir = bidir
# test that trx_type belongs to eqpt_config.json
# if not replace it with a default
try:
if equipment['Transceiver'][Request.trx_type]:
self.trx_type = correct_xlrd_int_to_str_reading(Request.trx_type)
if Request.mode is not None:
Requestmode = correct_xlrd_int_to_str_reading(Request.mode)
if [mode for mode in equipment['Transceiver'][Request.trx_type].mode if mode['format'] == Requestmode]:
self.mode = Requestmode
else:
msg = f'Request Id: {self.request_id} - could not find tsp : \'{Request.trx_type}\' with mode: \'{Requestmode}\' in eqpt library \nComputation stopped.'
# print(msg)
logger.critical(msg)
raise ServiceError(msg)
else:
Requestmode = None
self.mode = Request.mode
except KeyError:
msg = f'Request Id: {self.request_id} - could not find tsp : \'{Request.trx_type}\' with mode: \'{Request.mode}\' in eqpt library \nComputation stopped.'
# print(msg)
logger.critical(msg)
raise ServiceError(msg)
# excel inputs are in GHz and dBm
if Request.spacing is not None:
self.spacing = Request.spacing * 1e9
else:
msg = f'Request {self.request_id} missing spacing: spacing is mandatory.\ncomputation stopped'
logger.critical(msg)
raise ServiceError(msg)
if Request.power is not None:
self.power = db2lin(Request.power) * 1e-3
else:
self.power = None
if Request.nb_channel is not None:
self.nb_channel = int(Request.nb_channel)
else:
self.nb_channel = None
value = correct_xlrd_int_to_str_reading(Request.disjoint_from)
self.disjoint_from = [n for n in value.split(' | ') if value]
self.nodes_list = []
if Request.nodes_list:
self.nodes_list = Request.nodes_list.split(' | ')
self.loose = 'LOOSE'
if Request.is_loose.lower() == 'no':
self.loose = 'STRICT'
self.path_bandwidth = None
if Request.path_bandwidth is not None:
self.path_bandwidth = Request.path_bandwidth * 1e9
else:
self.path_bandwidth = 0
uid = property(lambda self: repr(self))
@property
def pathrequest(self):
# Default assumption for bidir is False
req_dictionnary = {
'request-id': self.request_id,
'source': self.source,
'destination': self.destination,
'src-tp-id': self.srctpid,
'dst-tp-id': self.dsttpid,
'bidirectional': self.bidir,
'path-constraints': {
'te-bandwidth': {
'technology': 'flexi-grid',
'trx_type': self.trx_type,
'trx_mode': self.mode,
'effective-freq-slot': [{'N': 'null', 'M': 'null'}],
'spacing': self.spacing,
'max-nb-of-channel': self.nb_channel,
'output-power': self.power
}
}
}
if self.nodes_list:
req_dictionnary['explicit-route-objects'] = {}
temp = {'route-object-include-exclude': [
{'explicit-route-usage': 'route-include-ero',
'index': self.nodes_list.index(node),
'num-unnum-hop': {
'node-id': f'{node}',
'link-tp-id': 'link-tp-id is not used',
'hop-type': f'{self.loose}',
}
}
for node in self.nodes_list]
}
req_dictionnary['explicit-route-objects'] = temp
if self.path_bandwidth is not None:
req_dictionnary['path-constraints']['te-bandwidth']['path_bandwidth'] = self.path_bandwidth
return req_dictionnary
@property
def pathsync(self):
if self.disjoint_from:
return {'synchronization-id': self.request_id,
'svec': {
'relaxable': 'false',
'disjointness': 'node link',
'request-id-number': [self.request_id] + [n for n in self.disjoint_from]
}
}
else:
return None
# TO-DO: avoid multiple entries with same synchronisation vectors
@property
def json(self):
return self.pathrequest, self.pathsync
def read_service_sheet(
input_filename,
eqpt,
network,
network_filename=None,
bidir=False,
filter_region=None):
""" converts a service sheet into a json structure
"""
if filter_region is None:
filter_region = []
if network_filename is None:
network_filename = input_filename
service = parse_excel(input_filename)
req = [Request_element(n, eqpt, bidir) for n in service]
req = correct_xls_route_list(network_filename, network, req)
# if there is no sync vector, do not write any synchronization
synchro = [n.json[1] for n in req if n.json[1] is not None]
if synchro:
data = {
'path-request': [n.json[0] for n in req],
'synchronization': synchro
}
else:
data = {
'path-request': [n.json[0] for n in req]
}
return data
def correct_xlrd_int_to_str_reading(v):
if not isinstance(v, str):
value = str(int(v))
if value.endswith('.0'):
value = value[:-2]
else:
value = v
return value
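A quick sketch of the intended behaviour: xlrd returns numeric cells as floats, so an id typed as 12 in Excel comes back as 12.0 and is normalized to the string '12', while strings pass through untouched.

from gnpy.tools.service_sheet import correct_xlrd_int_to_str_reading

print(correct_xlrd_int_to_str_reading(12.0))       # '12'   (numeric Excel cell)
print(correct_xlrd_int_to_str_reading('Route-7'))  # 'Route-7' (strings are returned as-is)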
def parse_row(row, fieldnames):
return {f: r.value for f, r in zip(fieldnames, row[0:SERVICES_COLUMN])
if r.ctype != XL_CELL_EMPTY}
def parse_excel(input_filename):
with open_workbook(input_filename) as wb:
service_sheet = wb.sheet_by_name('Service')
services = list(parse_service_sheet(service_sheet))
return services
def parse_service_sheet(service_sheet):
""" reads each column according to authorized fieldnames. order is not important.
"""
logger.info(f'Validating headers on {service_sheet.name!r}')
# add a test on field to enable the '' field case that arises when columns on the
# right hand side are used as comments or drawing in the excel sheet
header = [x.value.strip() for x in service_sheet.row(4)[0:SERVICES_COLUMN]
if len(x.value.strip()) > 0]
# create a service_fieldname independent from the excel column order
# to be compatible with any version of the sheet
# the following dictionary records the excel field names and the corresponding parameter names
authorized_fieldnames = {
'route id': 'request_id', 'Source': 'source', 'Destination': 'destination',
'TRX type': 'trx_type', 'Mode': 'mode', 'System: spacing': 'spacing',
'System: input power (dBm)': 'power', 'System: nb of channels': 'nb_channel',
'routing: disjoint from': 'disjoint_from', 'routing: path': 'nodes_list',
'routing: is loose?': 'is_loose', 'path bandwidth': 'path_bandwidth'}
try:
service_fieldnames = [authorized_fieldnames[e] for e in header]
except KeyError:
msg = f'Malformed header on Service sheet: {header} field not in {authorized_fieldnames}'
logger.critical(msg)
raise ValueError(msg)
for row in all_rows(service_sheet, start=5):
yield Request(**parse_row(row[0:SERVICES_COLUMN], service_fieldnames))
def correct_xls_route_list(network_filename, network, pathreqlist):
""" prepares the format of route list of nodes to be consistant with nodes names:
remove wrong names, find correct names for ila, roadm and fused if the entry was
xls.
if it was not xls, all names in list should be exact name in the network.
"""
# first loads the base correspondence dict built with excel naming
corresp_roadm, corresp_fused, corresp_ila = corresp_names(network_filename, network)
# then correct dict names with names of the autodesign and find next_node name
# according to xls naming
corresp_ila, next_node = corresp_next_node(network, corresp_ila, corresp_roadm)
# finally correct constraints based on these dict
trxfibertype = [n.uid for n in network.nodes() if isinstance(n, (Transceiver, Fiber))]
roadmtype = [n.uid for n in network.nodes() if isinstance(n, Roadm)]
edfatype = [n.uid for n in network.nodes() if isinstance(n, Edfa)]
# TODO there is a problem of identification of fibers in case of parallel
# fibers between two adjacent roadms so fiber constraint is not supported
transponders = [n.uid for n in network.nodes() if isinstance(n, Transceiver)]
for pathreq in pathreqlist:
# first check that source and dest are transceivers
if pathreq.source not in transponders:
msg = f'{ansi_escapes.red}Request: {pathreq.request_id}: could not find' +\
f' transponder source : {pathreq.source}.{ansi_escapes.reset}'
logger.critical(msg)
raise ServiceError(msg)
if pathreq.destination not in transponders:
msg = f'{ansi_escapes.red}Request: {pathreq.request_id}: could not find' +\
f' transponder destination: {pathreq.destination}.{ansi_escapes.reset}'
logger.critical(msg)
raise ServiceError(msg)
# silently pop source and dest nodes from the list if they were added by the user as first
# and last elem in the constraints respectively. Other positions must lead to an error
# caught later on
if pathreq.nodes_list and pathreq.source == pathreq.nodes_list[0]:
pathreq.loose_list.pop(0)
pathreq.nodes_list.pop(0)
if pathreq.nodes_list and pathreq.destination == pathreq.nodes_list[-1]:
pathreq.loose_list.pop(-1)
pathreq.nodes_list.pop(-1)
# Then process user defined constraints with respect to automatic namings
temp = deepcopy(pathreq)
# This needs a temporary object since we may suppress/correct elements in the list
# during the process
for i, n_id in enumerate(temp.nodes_list):
# n_id must not be a transceiver and must not be a fiber (not supported; the user
# cannot enter fiber names in excel)
if n_id not in trxfibertype:
# check that n_id is in the node list; if not, find a correspondence name
if n_id in roadmtype + edfatype:
nodes_suggestion = [n_id]
else:
# checks first roadm, fused, and ila in this order, because ila automatic name
# contain roadm names. If it is a fused node, next ila names might be correct
# suggestions, especially if the following fibers were split and ila names
# created with the name of the fused node
if n_id in corresp_roadm.keys():
nodes_suggestion = corresp_roadm[n_id]
elif n_id in corresp_fused.keys():
nodes_suggestion = corresp_fused[n_id] + corresp_ila[n_id]
elif n_id in corresp_ila.keys():
nodes_suggestion = corresp_ila[n_id]
else:
nodes_suggestion = []
if nodes_suggestion:
try:
if len(nodes_suggestion) > 1:
# if there is more than one suggestion, we need to choose the direction
# we rely on the next node provided by the user for this purpose
new_n = next(n for n in nodes_suggestion
if n in next_node.keys() and next_node[n]
in temp.nodes_list[i:] + [pathreq.destination] and
next_node[n] not in temp.nodes_list[:i])
else:
new_n = nodes_suggestion[0]
if new_n != n_id:
# warns the user when the correct name is used only in verbose mode,
# eg 'a' is a roadm and correct name is 'roadm a' or when there was
# too much ambiguity, 'b' is an ila, its name can be:
# Edfa0_fiber (a → b)-xx if next node is c or
# Edfa0_fiber (c → b)-xx if next node is a
msg = f'{ansi_escapes.yellow}Invalid route node specified:' +\
f'\n\t\'{n_id}\', replaced with \'{new_n}\'{ansi_escapes.reset}'
logger.info(msg)
pathreq.nodes_list[pathreq.nodes_list.index(n_id)] = new_n
except StopIteration:
# shall not come in this case, unless requested direction does not exist
msg = f'{ansi_escapes.yellow}Invalid route specified {n_id}: could' +\
f' not decide on direction, skipped!\nPlease add a valid' +\
f' direction in constraints (next neighbour node){ansi_escapes.reset}'
print(msg)
logger.info(msg)
pathreq.loose_list.pop(pathreq.nodes_list.index(n_id))
pathreq.nodes_list.remove(n_id)
else:
if temp.loose_list[i] == 'LOOSE':
# if no matching can be found in the network just ignore this constraint
# if it is a loose constraint
# warns the user that this node is not part of the topology
msg = f'{ansi_escapes.yellow}Invalid node specified:\n\t\'{n_id}\'' +\
f', could not use it as constraint, skipped!{ansi_escapes.reset}'
print(msg)
logger.info(msg)
pathreq.loose_list.pop(pathreq.nodes_list.index(n_id))
pathreq.nodes_list.remove(n_id)
else:
msg = f'{ansi_escapes.red}Could not find node:\n\t\'{n_id}\' in network' +\
f' topology. Strict constraint can not be applied.{ansi_escapes.reset}'
logger.critical(msg)
raise ServiceError(msg)
else:
if temp.loose_list[i] == 'LOOSE':
print(f'{ansi_escapes.yellow}Invalid route node specified:\n\t\'{n_id}\'' +
f' type is not supported as constraint with xls network input,' +
f' skipped!{ansi_escapes.reset}')
pathreq.loose_list.pop(pathreq.nodes_list.index(n_id))
pathreq.nodes_list.remove(n_id)
else:
msg = f'{ansi_escapes.red}Invalid route node specified \n\t\'{n_id}\'' +\
f' type is not supported as constraint with xls network input;' +\
f' strict constraint can not be applied.{ansi_escapes.reset}'
logger.critical(msg)
raise ServiceError(msg)
return pathreqlist

@@ -0,0 +1,3 @@
'''
Tracking :py:mod:`.request` for spectrum and their :py:mod:`.spectrum_assignment`.
'''

gnpy/topology/request.py  (Normal file, 1169 lines; diff not shown because it is too large)

gnpy/topology/spectrum_assignment.py  (Normal file, 431 lines)
@@ -0,0 +1,431 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.topology.spectrum_assignment
=================================
This module contains the :class:`OMS` and :class:`Bitmap` classes and methods to
select and assign spectrum. The :func:`spectrum_selection` function identifies the free
slots and :func:`select_candidate` selects the candidate spectrum according to a
strategy, for example first fit.
An OMS records its elements, and elements are updated with an OMS to keep the
element/OMS correspondence.
"""
from collections import namedtuple
from logging import getLogger
from math import ceil
from gnpy.core.elements import Roadm, Transceiver
from gnpy.core.exceptions import ServiceError, SpectrumError
LOGGER = getLogger(__name__)
class Bitmap:
""" records the spectrum occupation
"""
def __init__(self, f_min, f_max, grid, guardband=0.15e12, bitmap=None):
# n is the min index including the guardband. The guardband is required to ensure
# that a channel can be assigned with center frequency f_min (meaning that its
# slot occupation may extend below freq_index_min)
n_min = frequency_to_n(f_min - guardband, grid)
n_max = frequency_to_n(f_max + guardband, grid) - 1
self.n_min = n_min
self.n_max = n_max
self.freq_index_min = frequency_to_n(f_min)
self.freq_index_max = frequency_to_n(f_max)
self.freq_index = list(range(n_min, n_max + 1))
if bitmap is None:
self.bitmap = [1] * (n_max - n_min + 1)
elif len(bitmap) == len(self.freq_index):
self.bitmap = bitmap
else:
raise SpectrumError(f'bitmap is not consistent with f_min {f_min} - n: {n_min} and f_max {f_max} - n: {n_max}')
def getn(self, i):
""" converts the n (itu grid) into a local index
"""
return self.freq_index[i]
def geti(self, nvalue):
""" converts the local index into n (itu grid)
"""
return self.freq_index.index(nvalue)
def insert_left(self, newbitmap):
""" insert bitmap on the left to align oms bitmaps if their start frequencies are different
"""
self.bitmap = newbitmap + self.bitmap
temp = list(range(self.n_min - len(newbitmap), self.n_min))
self.freq_index = temp + self.freq_index
self.n_min = self.freq_index[0]
def insert_right(self, newbitmap):
""" insert bitmap on the right to align oms bitmaps if their stop frequencies are different
"""
self.bitmap = self.bitmap + newbitmap
self.freq_index = self.freq_index + list(range(self.n_max, self.n_max + len(newbitmap)))
self.n_max = self.freq_index[-1]
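A minimal sketch of how the Bitmap can be instantiated, using illustrative C-band limits; all slots start free (1) and the index range extends beyond f_min/f_max by the guardband:

from gnpy.topology.spectrum_assignment import Bitmap

b = Bitmap(f_min=191.3e12, f_max=196.1e12, grid=0.00625e12)   # illustrative band limits
print(b.n_min, b.n_max)                       # negative/positive offsets around the 193.1 THz anchor
print(all(slot == 1 for slot in b.bitmap))    # True: nothing assigned yet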
# +'grid available_slots f_min f_max services_list')
OMSParams = namedtuple('OMSParams', 'oms_id el_id_list el_list')
class OMS:
""" OMS class is the logical container that represent a link between two adjacent ROADMs and
records the crossed elements and the occupied spectrum
"""
def __init__(self, *args, **params):
params = OMSParams(**params)
self.oms_id = params.oms_id
self.el_id_list = params.el_id_list
self.el_list = params.el_list
self.spectrum_bitmap = []
self.nb_channels = 0
self.service_list = []
# TODO
def __str__(self):
return '\n\t'.join([f'{type(self).__name__} {self.oms_id}',
f'{self.el_id_list[0]} - {self.el_id_list[-1]}'])
def __repr__(self):
return '\n\t'.join([f'{type(self).__name__} {self.oms_id}',
f'{self.el_id_list[0]} - {self.el_id_list[-1]}', '\n'])
def add_element(self, elem):
""" records oms elements
"""
self.el_id_list.append(elem.uid)
self.el_list.append(elem)
def update_spectrum(self, f_min, f_max, guardband=0.15e12, existing_spectrum=None,
grid=0.00625e12):
""" frequencies expressed in Hz
"""
if existing_spectrum is None:
# add some 150 GHz margin to enable a center channel on f_min
# use ITU-T G694.1
# Flexible DWDM grid definition
# For the flexible DWDM grid, the allowed frequency slots have a nominal
# central frequency (in THz) defined by:
# 193.1 + n × 0.00625 where n is a positive or negative integer including 0
# and 0.00625 is the nominal central frequency granularity in THz
# and a slot width defined by:
# 12.5 × m where m is a positive integer and 12.5 is the slot width granularity in
# GHz.
# Any combination of frequency slots is allowed as long as no two frequency
# slots overlap.
# TODO : add an explanation of this / parametrize ....
self.spectrum_bitmap = Bitmap(f_min, f_max, grid, guardband)
# print(len(self.spectrum_bitmap.bitmap))
def assign_spectrum(self, nvalue, mvalue):
""" change oms spectrum to mark spectrum assigned
"""
if not isinstance(nvalue, int):
raise SpectrumError(f'N must be a signed integer, got {nvalue}')
if not isinstance(mvalue, int):
raise SpectrumError(f'M must be an integer, got {mvalue}')
if mvalue <= 0:
raise SpectrumError(f'M must be positive, got {mvalue}')
if nvalue > self.spectrum_bitmap.freq_index_max:
raise SpectrumError(f'N {nvalue} over the upper spectrum boundary')
if nvalue < self.spectrum_bitmap.freq_index_min:
raise SpectrumError(f'N {nvalue} below the lower spectrum boundary')
startn, stopn = mvalue_to_slots(nvalue, mvalue)
if stopn > self.spectrum_bitmap.n_max:
raise SpectrumError(f'N {nvalue}, M {mvalue} over the N spectrum bitmap bounds')
if startn <= self.spectrum_bitmap.n_min:
raise SpectrumError(f'N {nvalue}, M {mvalue} below the N spectrum bitmap bounds')
self.spectrum_bitmap.bitmap[self.spectrum_bitmap.geti(startn):self.spectrum_bitmap.geti(stopn) + 1] = [0] * (stopn - startn + 1)
def add_service(self, service_id, nb_wl):
""" record service and mark spectrum as occupied
"""
self.service_list.append(service_id)
self.nb_channels += nb_wl
def frequency_to_n(freq, grid=0.00625e12):
""" converts frequency into the n value (ITU grid)
reference to Recommendation G.694.1 (02/12), Figure I.3
https://www.itu.int/rec/T-REC-G.694.1-201202-I/en
>>> frequency_to_n(193.1375e12)
6
>>> frequency_to_n(193.225e12)
20
"""
return (int)((freq - 193.1e12) / grid)
def nvalue_to_frequency(nvalue, grid=0.00625e12):
""" converts n value into a frequency
reference to Recommendation G.694.1 (02/12), Table 1
https://www.itu.int/rec/T-REC-G.694.1-201202-I/en
>>> nvalue_to_frequency(6)
193137500000000.0
>>> nvalue_to_frequency(-1, 0.1e12)
193000000000000.0
"""
return 193.1e12 + nvalue * grid
def mvalue_to_slots(nvalue, mvalue):
""" convert center n an m into start and stop n
"""
startn = nvalue - mvalue
stopn = nvalue + mvalue - 1
return startn, stopn
def slots_to_m(startn, stopn):
""" converts the start and stop n values to the center n and m value
reference to Recommendation G.694.1 (02/12), Figure I.3
https://www.itu.int/rec/T-REC-G.694.1-201202-I/en
>>> nval, mval = slots_to_m(6, 20)
>>> nval
13
>>> mval
7
"""
nvalue = (int)((startn + stopn + 1) / 2)
mvalue = (int)((stopn - startn + 1) / 2)
return nvalue, mvalue
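The two helpers above are inverses of each other; for instance, center n=13 with m=7 spans 2*m = 14 grid indices of 6.25 GHz each (an 87.5 GHz slot):

from gnpy.topology.spectrum_assignment import mvalue_to_slots, slots_to_m

print(mvalue_to_slots(13, 7))   # (6, 19)
print(slots_to_m(6, 19))        # (13, 7)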
def m_to_freq(nvalue, mvalue, grid=0.00625e12):
""" converts m into frequency range
spectrum(13,7) is (193137500000000.0, 193225000000000.0)
reference to Recommendation G.694.1 (02/12), Figure I.3
https://www.itu.int/rec/T-REC-G.694.1-201202-I/en
>>> fstart, fstop = m_to_freq(13, 7)
>>> fstart
193137500000000.0
>>> fstop
193225000000000.0
"""
startn, stopn = mvalue_to_slots(nvalue, mvalue)
fstart = nvalue_to_frequency(startn, grid)
fstop = nvalue_to_frequency(stopn + 1, grid)
return fstart, fstop
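A short sketch tying the grid helpers together: the n index is simply the 6.25 GHz offset from the 193.1 THz anchor, so converting back and forth is lossless on grid-aligned frequencies:

from gnpy.topology.spectrum_assignment import frequency_to_n, nvalue_to_frequency, m_to_freq

n = frequency_to_n(193.225e12)    # 20
print(nvalue_to_frequency(n))     # 193225000000000.0
print(m_to_freq(13, 7))           # (193137500000000.0, 193225000000000.0), an 87.5 GHz slot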
def align_grids(oms_list):
""" used to apply same grid to all oms : same starting n, stop n and slot size
out of grid slots are set to 0
"""
n_min = min([o.spectrum_bitmap.n_min for o in oms_list])
n_max = max([o.spectrum_bitmap.n_max for o in oms_list])
for this_o in oms_list:
if (this_o.spectrum_bitmap.n_min - n_min) > 0:
this_o.spectrum_bitmap.insert_left([0] * (this_o.spectrum_bitmap.n_min - n_min))
if (n_max - this_o.spectrum_bitmap.n_max) > 0:
this_o.spectrum_bitmap.insert_right([0] * (n_max - this_o.spectrum_bitmap.n_max))
return oms_list
def build_oms_list(network, equipment):
""" initialization of OMS list in the network
an oms is build reading all intermediate nodes between two adjacent ROADMs
each element within the list is being added an oms and oms_id to record the
oms it belongs to.
the function supports different spectrum width and supposes that the whole network
works with the min range among OMSs
"""
oms_id = 0
oms_list = []
for node in [n for n in network.nodes() if isinstance(n, Roadm)]:
for edge in network.edges([node]):
if not isinstance(edge[1], Transceiver):
nd_in = edge[0] # nd_in is a Roadm
try:
nd_in.oms_list.append(oms_id)
except AttributeError:
nd_in.oms_list = []
nd_in.oms_list.append(oms_id)
nd_out = edge[1]
params = {}
params['oms_id'] = oms_id
params['el_id_list'] = []
params['el_list'] = []
oms = OMS(**params)
oms.add_element(nd_in)
while not isinstance(nd_out, Roadm):
oms.add_element(nd_out)
# add an oms_id in the element
nd_out.oms_id = oms_id
nd_out.oms = oms
n_temp = nd_out
nd_out = next(n[1] for n in network.edges([n_temp]) if n[1].uid != nd_in.uid)
nd_in = n_temp
oms.add_element(nd_out)
# nd_out is a Roadm
try:
nd_out.oms_list.append(oms_id)
except AttributeError:
nd_out.oms_list = []
nd_out.oms_list.append(oms_id)
oms.update_spectrum(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, grid=0.00625e12)
# oms.assign_spectrum(13,7) gives back (193137500000000.0, 193225000000000.0)
# as in the example in the standard
# oms.assign_spectrum(13,7)
oms_list.append(oms)
oms_id += 1
oms_list = align_grids(oms_list)
reversed_oms(oms_list)
return oms_list
def reversed_oms(oms_list):
""" identifies reversed OMS
only applicable to non-parallel OMS
"""
for oms in oms_list:
has_reversed = False
for this_o in oms_list:
if (oms.el_id_list[0] == this_o.el_id_list[-1] and
oms.el_id_list[-1] == this_o.el_id_list[0]):
oms.reversed_oms = this_o
has_reversed = True
break
if not has_reversed:
oms.reversed_oms = None
def bitmap_sum(band1, band2):
"""mark occupied bitmap by 0 if the slot is occupied in band1 or in band2"""
res = []
for i, elem in enumerate(band1):
if band2[i] * elem == 0:
res.append(0)
else:
res.append(1)
return res
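A tiny illustration: a slot stays free (1) only if it is free on every OMS along the path; anything occupied (0) on either input stays occupied in the result:

from gnpy.topology.spectrum_assignment import bitmap_sum

print(bitmap_sum([1, 1, 0, 1], [1, 0, 0, 1]))   # [1, 0, 0, 1]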
def spectrum_selection(pth, oms_list, requested_m, requested_n=None):
"""Collects spectrum availability and call the select_candidate function"""
# use indexes instead of ITU-T n values
path_oms = []
for elem in pth:
if not isinstance(elem, Roadm) and not isinstance(elem, Transceiver):
# only edfa, fused and fibers have oms_id attribute
path_oms.append(elem.oms_id)
# remove duplicate oms_id, order is not important
path_oms = list(set(path_oms))
# assuming all oms have same freq index
if not path_oms:
candidate = (None, None, None)
return candidate, path_oms
freq_index = oms_list[path_oms[0]].spectrum_bitmap.freq_index
freq_index_min = oms_list[path_oms[0]].spectrum_bitmap.freq_index_min
freq_index_max = oms_list[path_oms[0]].spectrum_bitmap.freq_index_max
freq_availability = oms_list[path_oms[0]].spectrum_bitmap.bitmap
for oms in path_oms[1:]:
freq_availability = bitmap_sum(oms_list[oms].spectrum_bitmap.bitmap, freq_availability)
if requested_n is None:
# avoid slots reserved on the edges (0.15e12 Hz guardband on both sides -> 24 slots)
candidates = [(freq_index[i] + requested_m, freq_index[i], freq_index[i] + 2 * requested_m - 1)
for i in range(len(freq_availability))
if freq_availability[i:i + 2 * requested_m] == [1] * (2 * requested_m)
and freq_index[i] >= freq_index_min
and freq_index[i + 2 * requested_m - 1] <= freq_index_max]
candidate = select_candidate(candidates, policy='first_fit')
else:
i = oms_list[path_oms[0]].spectrum_bitmap.geti(requested_n)
# print(f'N {requested_n} i {i}')
# print(freq_availability[i-m:i+m] )
# print(freq_index[i-m:i+m])
if (freq_availability[i - requested_m:i + requested_m] == [1] * (2 * requested_m) and
freq_index[i - requested_m] >= freq_index_min
and freq_index[i + requested_m - 1] <= freq_index_max):
# candidate is the triplet center_n, startn and stopn
candidate = (requested_n, requested_n - requested_m, requested_n + requested_m - 1)
else:
candidate = (None, None, None)
# print("coucou11")
# print(candidate)
# print(freq_availability[321:321+2*m])
# a = [i+321 for i in range(2*m)]
# print(a)
# print(candidate)
return candidate, path_oms
def select_candidate(candidates, policy):
""" selects a candidate among all available spectrum
"""
if policy == 'first_fit':
if candidates:
return candidates[0]
else:
return (None, None, None)
else:
raise ServiceError('Only first_fit spectrum assignment policy is implemented.')
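As a small sketch of the only supported policy, first_fit simply returns the first (center_n, startn, stopn) triplet, or a None triplet when nothing fits:

from gnpy.topology.spectrum_assignment import select_candidate

print(select_candidate([(13, 6, 19), (29, 22, 35)], policy='first_fit'))   # (13, 6, 19)
print(select_candidate([], policy='first_fit'))                            # (None, None, None)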
def pth_assign_spectrum(pths, rqs, oms_list, rpths):
""" basic first fit assignment
if reversed paths are provided, it means that the occupation is bidirectional
"""
for i, pth in enumerate(pths):
# computes the number of channels required
try:
if rqs[i].blocking_reason:
rqs[i].blocked = True
rqs[i].N = 0
rqs[i].M = 0
except AttributeError:
nb_wl = ceil(rqs[i].path_bandwidth / rqs[i].bit_rate)
# computes the total nb of slots according to requested spacing
# TODO : express superchannels
# assumes that all channels must be grouped
# TODO : enables non contiguous reservation in case of blocking
requested_m = ceil(rqs[i].spacing / 0.0125e12) * nb_wl
# concatenate all path and reversed path elements to derive slots availability
(center_n, startn, stopn), path_oms = spectrum_selection(pth + rpths[i], oms_list, requested_m,
requested_n=None)
# if center_n is not None, then center_n and the start/stop frequencies are applicable to all oms of pth
# checks that spectrum is not None, else indicate the blocking reason
if center_n is not None:
# checks that requested_m fits between startn and stopn
if 2 * requested_m > (stopn - startn + 1):
msg = f'candidate: {(center_n, startn, stopn)} is not consistent ' +\
f'with {requested_m}'
LOGGER.critical(msg)
raise ValueError(msg)
for oms_elem in path_oms:
oms_list[oms_elem].assign_spectrum(center_n, requested_m)
oms_list[oms_elem].add_service(rqs[i].request_id, nb_wl)
rqs[i].blocked = False
rqs[i].N = center_n
rqs[i].M = requested_m
else:
rqs[i].blocked = True
rqs[i].N = 0
rqs[i].M = 0
rqs[i].blocking_reason = 'NO_SPECTRUM'

hooks/pre-commit  (Executable file, 3 lines)
@@ -0,0 +1,3 @@
#!/bin/sh
exec pytest

Some files were not shown because too many files have changed in this diff.