327 Commits

Author SHA1 Message Date
EstherLerouzic
f6dab0477b Add the possibility to input id instead of explicit topo or eqpt
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I81c9fd56773e6f998bc2bdf87fc2aef817e252a4
2021-03-02 15:25:26 +01:00
manuedelf
ce4a226615 add autodesign routes
Signed-off-by: manuedelf <59697943+edelfour@users.noreply.github.com>
2021-01-25 23:11:05 +01:00
manuedelf
78fc0c0680 fix error in if
Signed-off-by: manuedelf <59697943+edelfour@users.noreply.github.com>
2021-01-05 16:24:25 +01:00
manuedelf
2c4f2fbb12 Add equipments and topologies endpoints
- add POST, PUT, DELETE on equipments
- add POST, PUT, GET, DELETE on topologies
- path-computation request body can now have equipment id and/or
topology id instead of full data
- activate Flask's embedded HTTPS while waiting for a real trusted
certificate
- update readme
- add request payload samples in yang directory
- equipment data are encrypted with Fernet (see the sketch after this entry)

Signed-off-by: manuedelf <59697943+edelfour@users.noreply.github.com>
2020-12-23 15:06:02 +01:00
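For the Fernet point above, a minimal sketch of symmetric payload encryption with the cryptography library; the key handling and the payload shown are assumptions, not taken from the API code:

  from cryptography.fernet import Fernet

  key = Fernet.generate_key()      # in practice, a persistent key loaded from configuration
  cipher = Fernet(key)
  token = cipher.encrypt(b'{"Edfa": []}')          # opaque token, safe to store at rest
  assert cipher.decrypt(token) == b'{"Edfa": []}'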
manuedelf
63545c86ed Put api in a dedicated python package
Signed-off-by: manuedelf <59697943+edelfour@users.noreply.github.com>
2020-12-22 13:49:46 +01:00
manuedelf
aa78d00158 remove obsolete example 2020-11-18 14:16:58 +01:00
EstherLerouzic
6a0e73e332 add an example request (with answers)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Id75fff88cd3b03fcf965c22763075ac3dbea41c6
2020-11-06 18:20:13 +01:00
EstherLerouzic
fa6b8c87e4 add 'gnpy-api:' context when reading the content of the request
in order to be compliant with YANG

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Ifa6ab93025b18a5a678b625e42e3d351499c69d7
2020-11-06 17:39:03 +01:00
EstherLerouzic
801c66aae2 adding yang corresponding to the json inputs
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I75b0cc3c3ce84dc724e588f918bddf0a5a97225d
2020-11-06 17:39:03 +01:00
EstherLerouzic
f60c347a48 support missing trx_mode in request instead of null value
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I5c05b17b0b134c7782a08e86015dc30c7c9b3713
2020-11-06 17:39:03 +01:00
EstherLerouzic
649bb3bd0f Change N values from 0 to None in case of NO_SPECTRUM
When spectrum cannot be assigned, the default value for N is set to 0,
which is not correct (N is a meaningful value for the
center frequency index). This change replaces this default
value with None.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Ibe642682e48d09f340d53e2092f172de6aa7cc90
2020-11-06 17:38:28 +01:00
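A minimal sketch of the change above, with illustrative names (the real attributes and constants in GNPy may differ): when no spectrum can be assigned, report None rather than 0, because 0 is itself a valid index.

  def reported_n(center_n, blocking_reason):
      """Return the N value to report; None, not 0, when nothing was assigned."""
      if blocking_reason == 'NO_SPECTRUM':
          return None   # 0 would silently claim a real center-frequency index
      return center_n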
EstherLerouzic
6051ad54bc Enabling the reading of N and M value from the json request
For this commit only the first element from the {N, M} list is read
and assigned.

This is better than not reading this value at all.

The commit also updates test files and test data files with correct
values for the effective_freq_slot attribute.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I1e60fe833ca1092b40de27c8cbfb13083810414e
2020-11-06 14:44:13 +01:00
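A sketch of reading just the first {N, M} pair, assuming the request carries an 'effective-freq-slot' list of {'N': ..., 'M': ...} objects; the key names are an assumption based on the attribute mentioned above:

  def first_slot(request):
      slots = request.get('effective-freq-slot') or []
      if slots:
          return slots[0].get('N'), slots[0].get('M')   # only the first element is used
      return None, None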
EstherLerouzic
95d24d8e20 Avoid overwriting blocking reason
When a path is blocked for the 'NO_FEASIBLE_MODE' reason, and bidir is true,
the request attributes are filled with the last explored mode's values
(notably baudrate), and the reversed path is propagated with this last
explored mode's specs. If this reversed path is also not feasible, the
blocking reason was overwritten with a 'MODE_NOT_FEASIBLE' reason, because
baudrate is filled in the request attribute.

This change ensures that the blocking reason (if it exists) is not overwritten.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: If80a37d77e2b967a327562c733a44e7f78f1c544
2020-11-06 14:43:54 +01:00
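The guard described above, as a minimal sketch with an illustrative attribute name: a reason is recorded only when none exists yet, so a later 'MODE_NOT_FEASIBLE' cannot mask the original 'NO_FEASIBLE_MODE'.

  def record_blocking(request, reason):
      if getattr(request, 'blocking_reason', None) is None:
          request.blocking_reason = reason   # never overwrite an earlier reason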
manuedelf
6c449edece docker image update + readme 2020-10-16 23:34:22 +02:00
manuedelf
d051f93d55 Rest api for GNPy 2020-10-15 16:27:18 +02:00
Jan Kundrát
ee92011b21 docs: fix description of gnpy-path-request
Since about f9e0d18 and 94a8f35, we've changed the way the equipment
config (and also the network topology) is passed around. Update the
README to reflect this.

Change-Id: I338b790fd4d54914c49e5e0aac3f44f2bc9d00ee
2020-06-18 12:14:24 +02:00
Jan Kundrát
9861a22ef9 docs: do not talk about branches anymore
We talked about this during the coders call on 2020-06-02. When this
project started, it was decided to keep the `master` branch essentially
frozen between releases, and concentrate development into the `develop`
branch. Since that time, we've got decent CI coverage and end-to-end
tests which should guard us against regressions and inaccurate
simulation results. It also seems that our users have a preference for
using tagged releases anyway, so we can get by with deprecating the
`develop` branch.

As agreed during the conf call, from this release on we'll be merging
approved changes directly into the `master` branch, and the `develop`
branch will be retired. We have measures in place that ought to limit
the risk of regression within `master` to an acceptable level.

Change-Id: I0048ba8e36810237667e6930380a9c8570090a26
2020-06-18 11:48:01 +02:00
Jan Kundrát
59f9f35817 CI: provision secrets for uploads to PyPI
Originally I was hoping that this would be possible via the parent job
collection (oopt-zuul-jobs), but that's unfortunately not the case, so
I have to do this in this project.

I created the token at PyPI's own web UI for my account, and saved the
secret to a file. Then I executed Zuul's helper script:

  $ ~/work/prog/openstack/zuul/zuul/tools/encrypt_secret.py \
    --infile secret-pypi --tenant TelecomInfraProject \
    https://zuul.vexxhost.dev/ Telecominfraproject/oopt-gnpy

Change-Id: I05ab853ae26a449212e832e0f7e261c9ed71e364
2020-06-17 23:01:33 +02:00
Jan Kundrát
1bb475671d docs: badges: show code coverage as well
One yellowish item among the badges, now that should make it look legit
and real 🌈.

Change-Id: I13adadb1edc538428d24cc9d50c0569f49c5d7a3
2020-06-17 21:52:09 +02:00
Jan Kundrát
566dedbdbb docs: badges: show code quality via LGTM.com
It gives us an A, so we can link there :).

Change-Id: I0613394e135bf3de3dfd9e984f970328a5aa2fa4
2020-06-17 21:43:08 +02:00
Jan Kundrát
b44c4cec16 docs: boast about the contributor count
...because why not, right?

Change-Id: Ib853d7b3496f2b7d915785e72bfa314bf82a2802
2020-06-17 21:31:06 +02:00
Jan Kundrát
93d11ba408 docs: badges: prepare for a future link to Zuul
Change-Id: Iff5ffa1f6d1623c8d5b612f7d899c4e8f3f7cb85
2020-06-17 21:20:57 +02:00
Jan Kundrát
637670fcfa docs: Move the gnpy-path-requests into README
This looks like something that's been "always" part of the Excel guide.
I think it is better placed in the README. After the release we might
want to improve the docs even more, but hey, that's something which
should be done after some discussion, not via these commits that I'm
going to self-approve shortly.

Change-Id: Icd73f4bb3a43f1a684ec7a364e4ebcc9b8e6af88
2020-06-17 21:19:13 +02:00
Jan Kundrát
7836297708 README: point all badges to the master branch
We want to deprecate `develop`, so let's start pointing to master now.

Change-Id: Ibf69c36f44fde7081238508d55137c8df21cfa60
2020-06-17 21:02:21 +02:00
Jan Kundrát
1f8b4ab9a2 docs: Move install instructions into the generated docs
The PowerShellSessionLexer within pygments has trouble parsing the
example as something valid, so let's revert to a generic one.

Change-Id: I188e1d7bf2f7229ad15c1f0443584344fd11bf84
2020-06-17 20:43:37 +02:00
Jan Kundrát
05eb312f4a docs: Do not run rstcheck on sphinx-consumed docs
Turns out that we're already running Sphinx with options that are strict
enough to perform all sorts of checks. The `rstcheck` standalone tool is
too strict, it warns about legitimate Sphinx constructs (hence the
ignore list), and also happens to issue false warnings. Let's rely on
Sphinx for stuff below docs/, but still invoke rstcheck for the
top-level README (and anything that's not included in the Sphinx docs,
really, which is right now the README, AUTHORS, and a JSON-specific doc
that I have yet to convert).

Bug: https://github.com/myint/rstcheck/issues/19
Change-Id: Ib787d11e7d21452570618288acdf15b3e85270a7
2020-06-17 20:43:37 +02:00
Jan Kundrát
a98e244abd docs: Fix Pygments highlighting
This will make it possible to use this document within Sphinx without
warnings.

Change-Id: I069cdc42b451102d4e731c8d848716126b0e518a
2020-06-17 20:23:58 +02:00
Jan Kundrát
c945bc40fe docs: move JSON and XLS instructions into the generated docs
Change-Id: I659dd8e53663286b1382d1786f46c5341bf7ea44
2020-06-17 20:23:58 +02:00
Jan Kundrát
749b9287a9 Merge remote-tracking branch 'origin/develop' in preparation of release 2.2
Change-Id: I3d503a9c1609e174f8b0a2e11afb646c879099cc
2020-06-17 12:55:12 +02:00
Jan Kundrát
202c76bd6e CI: enable Zuul builds
I'm hoping that merging this change will make Zuul handle that merge
commit.

Change-Id: I31fcc353ee6044355ebbd3ff61bc2ba1b33eb2c7
2020-06-17 12:50:15 +02:00
Jan Kundrát
eec0943ca2 Merge "Upload releases to PyPI upon tagging" into develop 2020-06-13 14:12:19 +00:00
Alessio Ferrari
06d59a5834 Introduce polarization mode dispersion (PMD)
Change-Id: I687591df4662884b734ec945e9968713019ea0fc
2020-06-12 09:08:22 +02:00
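For reference, the standard accumulation model behind such a feature (not necessarily GNPy's exact code): each span contributes pmd_coef * sqrt(length), and span contributions add in quadrature.

  from math import sqrt

  def total_pmd(spans):
      """spans: iterable of (pmd_coef [s/sqrt(m)], length [m]) pairs."""
      return sqrt(sum((coef * sqrt(length)) ** 2 for coef, length in spans))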
Alessio Ferrari
94949d955b Introduce computation of the chromatic dispersion
Change-Id: I3ee039154568d4255444fa8db5e89945851010f4
2020-06-12 08:45:46 +02:00
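Again for reference, the textbook rule (GNPy's bookkeeping may differ in detail): chromatic dispersion accumulates linearly with distance, span by span.

  def total_cd(spans):
      """spans: iterable of (D [s/m^2], length [m]) pairs -> accumulated CD in s/m."""
      return sum(dispersion * length for dispersion, length in spans)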
Alessio Ferrari
b74d0a4919 Add dispersion slope
Change-Id: Iced385787896793437be410a189c67e05da87714
2020-06-12 08:38:51 +02:00
Alessio Ferrari
c8fa7635e0 Bug fix in converting the dispersion D in beta2
The actual conversion formula includes the minus (-), not the absolute
value. We never noticed it as GNPy simulates only modern networks
based on uncompensated transmission, which have no DCUs. In that case,
the sign of beta2 along a path is the same for all the spans and
the actual amount of NLI does not change.

Change-Id: I60a61d00c578a1a0436231a2bda8e3b6256fc8b3
2020-06-12 07:35:40 +02:00
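The conversion in question is the textbook relation beta2 = -D * lambda^2 / (2 * pi * c); the fix keeps the minus sign instead of taking the absolute value. A standalone sketch:

  from scipy.constants import c, pi

  def d_to_beta2(dispersion, wavelength):
      """dispersion D in s/m^2 (SI units), wavelength in m -> beta2 in s^2/m."""
      return -dispersion * wavelength ** 2 / (2 * pi * c)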
Jan Kundrát
4c6cfbda5d Merge changes from topic "moving-examples" into develop
* changes:
  Enable saving the network as converted from XLS
  Save either to JSON or to CSV, not to both
  CLI: Allow Raman in path_requests_run
  CLI: Unify handling of the network topology
  Remove unused variables
  CLI: show default values in --help
  Unify handling of the --equipment option in examples
  CLI: specify shared code options just once
  Tweak the --help output
  Remove unused statements
  XLS -> JSON conversion: add a nice program for this
  transmission_main_example: Do not write out a CSV file
  tests: don't clutter up the source dir with generated CSVs
  use load_json instead of open coding
  tests: Do not produce JSON files in the source tree
  Split JSON export from service XLS reading
  tests: Do not create JSON files in the source tree
  Do not always write out JSON data when reading XLS files
  Remove incomplete support for "fuzzy name matching"
  distribute example data along GNPy
  Do not create *_auto_design.json by default
  tests: remove something which looks like a path, but is not a valid path
  tests: show that the examples still work when directly invoked
  examples: add some additional descriptions and help context
  Distribute our examples via setuptools
  tests: call our example entry points via functions
  examples: prepare for overriding sys.argv
  examples: move path_requests_run to gnpy.tools
  examples: move transmission_main_example into gnpy.tools
  examples: use ansi_escapes
  examples: manual coding style tweaks
  examples: autopep8 -aaaaaaaaaa
  examples: autopep8
2020-06-11 20:31:54 +00:00
Jan Kundrát
80e9423590 Upload releases to PyPI upon tagging
Change-Id: Ia0543a5a091069e9503c8ba6a09930c35a0104f7
2020-06-10 16:39:09 +02:00
Jan Kundrát
78010aaaef Enable saving the network as converted from XLS
This is a feature that was requested by Esther due to their workflows at
Orange. There's also a standalone converter available as
`gnpy-convert-xls`.

Change-Id: I1a483d168db0744fbf115e05e679e13b57d79398
2020-06-10 14:27:05 +02:00
Jan Kundrát
3857ab1dbb Save either to JSON or to CSV, not to both
I don't see a reason for that; the old debug texts were actively
misleading.

Change-Id: I2089bf8f87ec994770cc272e054650e18da4ef2a
2020-06-10 14:15:50 +02:00
Jan Kundrát
dbb09e4108 CLI: Allow Raman in path_requests_run
There's no need to limit this to just the transmission_main_example, so
let's unify this handling.

Change-Id: I585f407c7f80da12fd33baf7261c35c736d78df2
2020-06-10 14:11:44 +02:00
Jan Kundrát
94a8f3568a CLI: Unify handling of the network topology
Change-Id: I51c98ae13218715862fe9f585b2cc4b079498bee
2020-06-10 14:11:42 +02:00
Jan Kundrát
ae7c9321d0 Remove unused variables
They are never read from, not even in any commented-out debugging code,
so let's nuke these.

Change-Id: I3188511adb28242dc40418cb3bb90b38bc4fdb14
2020-06-10 14:09:52 +02:00
Jan Kundrát
648cc3a8e5 CLI: show default values in --help
I think this is a little bit cleaner than duplicating the info about the
default for some of these options.

Change-Id: If218a26ede3e71628f4839b4e505c4f4aa217699
2020-06-10 14:09:52 +02:00
Jan Kundrát
f9e0d18a9d Unify handling of the --equipment option in examples
Let's use the --option format instead of positional arguments; that way
it's more obvious that it can be omitted. Note that this constitutes a
change of behavior for the path_requests_run example.

Change-Id: Ic6653cf419e1a8573c3585190a88fc51500f549d
2020-06-10 14:09:52 +02:00
Jan Kundrát
2d57fd9f85 CLI: specify shared code options just once
Change-Id: I79d2c9dfd630ee72ff1b87d4b24025a20d1e2ce2
2020-06-10 14:09:49 +02:00
Jan Kundrát
f053f32301 Tweak the --help output
Try to indicate whether an option takes just JSON, or a JSON or an XLS
file. Also add some extra descriptions.

Change-Id: Ifb81d46f6ac659da79b08201a414822e9c318a1e
2020-06-10 14:07:40 +02:00
Jan Kundrát
8e9d715e9f Remove unused statements
Change-Id: Ib859254468f02f7384d736fa9d7120a0ad1aaa15
2020-06-10 12:14:41 +02:00
Jan Kundrát
9fd55a5289 XLS -> JSON conversion: add a nice program for this
Esther mentioned that it is useful for her to be able to convert from
XLS files to JSON files. Let's add a full-blown script for this.

I've also taken the liberty to refactor the code a bit so that there's
no default value, and to modernize everything with pathlib a little bit.

Change-Id: I80e50fc1280003910242ce1ff9fc9ae66e6d275b
2020-06-10 12:14:41 +02:00
Jan Kundrát
914d0dbecd transmission_main_example: Do not write out a CSV file
We talked about this earlier today on a call, and agreed with Esther and
Alessio that this is probably a relic from the past. The file does not
appear to contain much useful information, anyway, so let's try to
remove it and wait if someone complains.

Change-Id: I215eeb37498b28b15ece2300f4bbdd184ac52f4a
2020-06-10 12:14:41 +02:00
Jan Kundrát
c38fe72ff7 tests: don't clutter up the source dir with generated CSVs
Change-Id: Ice9287dea2ed16ece2594e21eeab4c69e927947e
2020-06-08 20:35:58 +02:00
Jan Kundrát
80f63d32ed use load_json instead of open coding
Change-Id: I43cfbb7272bfdd834fad63e6715932ff45aeac0b
2020-06-08 20:06:11 +02:00
Jan Kundrát
9030f8f84f tests: Do not produce JSON files in the source tree
Change-Id: I7b9c65b93ba2ce64beef4de050b44c49a85163b7
2020-06-08 19:43:51 +02:00
Jan Kundrát
0d5f1c7d80 Split JSON export from service XLS reading
We have a better module for this.

Change-Id: Id4b68d3ddb119f27df3dfac52277981558fc5e50
2020-06-08 18:47:15 +02:00
Jan Kundrát
5bc42332cd tests: Do not create JSON files in the source tree
Change-Id: I5ef71cc82466367fa9e9eea4d3f02453a4d2f469
2020-06-08 18:31:24 +02:00
Jan Kundrát
093b85d4a3 Do not always write out JSON data when reading XLS files
There's no reason for this; in fact, the code got easier to read once
that detour to disk was removed.

Change-Id: I45db215898da962e625a7fea6eda57744e21ff8a
2020-06-08 18:31:21 +02:00
Jan Kundrát
0efa0d310d Remove incomplete support for "fuzzy name matching"
This is something which got added in bc9eee32, but it never got
finalized to have a user-visible effect. To the best of my knowledge, it
only created a file which was never used.

I removed code which created that file in 0d542f22, so let's clean up
the rest.

I think this should also restore functionality of running convert.py in
a standalone mode. Looking at the ArgParser, the invocation never
considered the names_matching parameter.

Change-Id: Id0f4aa1db2d22233f74fb273176168a16ace4072
2020-06-08 18:30:38 +02:00
Jan Kundrát
8eb5980ca9 distribute example data along GNPy
I would like to create a package for distribution via pip, and this seems
like the path of least resistance.

This is, apparently, the way for shipping arbitrary data with Python
[1]. I've at least tried to make it user-friendly by adding a simple
utility which just prints out whatever that data path is.

[1] https://python-packaging.readthedocs.io/en/latest/non-code-files.html

Change-Id: I220ecad84b1d57d01e3f98f15befc700bd97c0b8
2020-06-08 18:30:36 +02:00
Jan Kundrát
754be7ca08 Do not create *_auto_design.json by default
It's a bad habit to write files into the source code repository. It will
also become impossible if gnpy is installed into a systemwide, possibly
read-only location.

The old behavior can be reactivated by using an extra option to tell
GNPy where to put the generated file.

Change-Id: I9ad43890ca5886f516885de5938a4778870a06c5
2020-06-08 18:28:59 +02:00
Jan Kundrát
c5c5b693f2 tests: remove something which looks like a path, but is not a valid path
As I'm moving the top-level directory `examples/` to another place, I
wanted to clear the source of any mentions of examples which are not
actually valid paths.

Change-Id: If6cce20feacfbbb79549e865d06aa00fd2dcd08d
2020-06-08 18:28:59 +02:00
Jan Kundrát
7f816eb6e7 tests: show that the examples still work when directly invoked
Since Ic4a124a5cbe2bd24c56e4565d27d313fe4da703f, there was no automated
test which would check if the generated examples *really* work. When I
was playing with this, I managed to break it at least once (especially
when working on overriding sys.argv, i.e.,
I53833a5513abae0abd57065a49c0f357890e0820).

This now requires an equivalent of `pip install` before the tests can be
run.

Change-Id: I595a3efe29b3ee13800c5cb71f28a5f370988617
2020-06-08 18:28:59 +02:00
Jan Kundrát
1009b44d2a examples: add some additional descriptions and help context
The main reason for doing this is the next commit, which
re-adds testing of the generated wrappers.

Change-Id: I7137c6cf7a5b414fc708a15b125eaf88e996366c
2020-06-08 18:28:56 +02:00
Jan Kundrát
3b61c6ca4c Distribute our examples via setuptools
Cc issue #352.

Change-Id: I31e67130540fabeb2666dea38da6c435236e7f5b
2020-06-05 18:04:53 +02:00
Jonas Mårtensson
cc11bd186c Update compute_constrained_path
This patch proposes a new implementation of the compute_constrained_path
function based on the same method as the newly proposed
compute_k_constrained_paths function, i.e. using shortest_simple_paths
instead of all_simple_paths. This method is more efficient and avoids
having to set a cutoff parameter. The new implementation should be
identical to the old one from an external perspective, except that it
finds a path with include node constraints in more cases.

Change-Id: Ia93b61c0af27076ed5088013bc87787a2920b629
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-06-03 20:57:39 +02:00
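A minimal sketch of that method, assuming networkx and a simple unordered include-node check (the real function also deals with constraint ordering and looseness):

  import networkx as nx

  def constrained_path(graph, source, target, include_nodes):
      # paths arrive sorted from shortest to longest, so no cutoff is needed
      for path in nx.shortest_simple_paths(graph, source, target):
          if all(node in path for node in include_nodes):
              return path
      return None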
Jan Kundrát
0d1225824e tests: call our example entry points via functions
I would like to avoid that extra fork to a child Python interpreter (it
looks like something that can be easily avoided). It's something that's
possible now that the code ships just some trivial wrappers (which are,
in turn, needed for setuptools' `console_scripts`).

This cannot use the `capsysbinary` fixture for wrapping of stdout/stderr
due to something in pytest which already got fixed, but has not been
released yet (May 2020). Let's use `capfdbinary` which works fine.

Change-Id: Ic4a124a5cbe2bd24c56e4565d27d313fe4da703f
See-also: https://github.com/pytest-dev/pytest/pull/6926
2020-06-03 19:29:17 +02:00
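A sketch of the fixture in action: capfdbinary captures stdout/stderr at the file-descriptor level and hands back bytes.

  def test_example_output(capfdbinary):
      print('hello')
      captured = capfdbinary.readouterr()
      assert captured.out == b'hello\n'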
Jan Kundrát
a2ecfd924c examples: prepare for overriding sys.argv
...which will be done in the next commit. One has to be careful with
sys.argv here because it uses different indexing than when passing args
explicitly.

Change-Id: I53833a5513abae0abd57065a49c0f357890e0820
2020-06-03 19:29:03 +02:00
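The indexing trap comes from sys.argv[0] being the program name. The usual pattern, sketched here with the --equipment option used elsewhere in this log as a placeholder:

  import argparse
  import sys

  def main(args=None):
      parser = argparse.ArgumentParser()
      parser.add_argument('--equipment')
      # an explicit list is indexed from 0; sys.argv needs its program name sliced off
      return parser.parse_args(args if args is not None else sys.argv[1:])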
Jan Kundrát
95c3c9c488 examples: move path_requests_run to gnpy.tools
Change-Id: Id23dfa81bf9a9b347ea9c99f77d5735b62911d67
2020-06-03 19:22:53 +02:00
Jan Kundrát
11033a284f examples: move transmission_main_example into gnpy.tools
This converts our first free-standing example entry point into a
function. It will become very handy when we start distributing these
entry points via setuptools.

Change-Id: Icd2e4658337f93cd0b0301978e2dc640de0cc72b
2020-06-03 19:06:31 +02:00
Jan Kundrát
357e6cbf18 examples: use ansi_escapes
Change-Id: I958b00a7d7c079d9330cc0f4aec1f72b4a102223
2020-06-03 18:32:16 +02:00
Jan Kundrát
bfa6d29908 examples: manual coding style tweaks
Change-Id: Ie0ac42561afad331b8d22df0576ef071f358c5ed
2020-06-03 18:29:07 +02:00
Jan Kundrát
21e0589e7f examples: autopep8 -aaaaaaaaaa
Change-Id: I811bc80d9eca9df34c554362de62f98a897cbb05
2020-06-03 18:18:19 +02:00
Jan Kundrát
4d836246ec examples: autopep8
Change-Id: Ieed050fdfb9db57722b2b8de642c25ad2e4a50eb
2020-06-03 18:17:47 +02:00
Jan Kundrát
566943a099 Merge "Support propagation of single channel" into develop 2020-05-30 10:17:38 +00:00
Jan Kundrát
a4c1ea1f55 CI: Restore functionality via running on Python 3.6 on native packages
We have declared that we're supporting Python 3.6 as the minimal
required version of Python. Let's enforce this via testing. Let's also
do it via a long-term supported distro (CentOS8 in this case) so that we
can have someone else maintain stuff for us; it's easier to rely on
these prebuilt packages instead of downloading Python from upstream all
the time.

Also, Python 3.7 builds are currently broken in the CI because of some
breakage, either in the upstream zuul-jobs repo, or in Vexxhost's VM
images. So, now that we have Python 3.6 VMs available, use this to
restore CI functionality. That part can be reverted in future when
Debian Buster-based builds are working again.

I've also tried to use Python 3.8 on latest Fedora (version 32). There
were a couple of gotchas, including /tmp on tmpfs with not enough space
for building all required packages. Also, there's another issue with
building Pandas from scratch, so let's defer using Python 3.8 until
later. It's something that I will definitely look into, though, because
of significant speed improvements in that distro.

Change-Id: Ic19bf58875ba6d6afaee77f5c9b467261ec686d0
Depends-on: https://review.gerrithub.io/c/Telecominfraproject/oopt-zuul-jobs/+/494597
2020-05-30 11:55:43 +02:00
Jan Kundrát
6b10ed15f8 Restructuring and reorganization of the library structure
Change-Id: I111922113858526c21b848b28c86a80d5c013d65
2020-05-26 19:32:38 +02:00
Jan Kundrát
c8daa5ed8c docs: basic stuff about the module structure
As a bonus, in the Python shell, `help(gnpy)`, `help(gnpy.core)`, etc,
now produce at least some useful information.

Change-Id: I76ade6f2456fcebd3c0a147374815dd245dc4b10
2020-05-26 18:36:30 +02:00
Jan Kundrát
0b03725295 tests: fix import of correct_json_route_list
It got moved in de58d7d7c2, let's import
it directly, not via examples/.

Change-Id: I0c36fcfdb775baac6fafc05ecd41ac39bafd750d
2020-05-23 22:14:52 +02:00
Jan Kundrát
ee5e64408d examples: common code for data loading
This also moves SimParams handling to a single place. As a result,
path_requests_run has just become Raman-aware (to the minimal possible
extent, OK).

Change-Id: I4e31af5c67335963ddab567d304f48a899cd569e
2020-05-23 22:03:23 +02:00
Jan Kundrát
b96ffe6c7b examples: unify exception handling
Change-Id: I7a6ac6ba54c597214bb85ee0dda2d6313b677730
2020-05-23 22:03:23 +02:00
Jan Kundrát
3b981853d4 SimParams: pretty exceptions upon a misconfiguration
Change-Id: I69f47925dd08c492b295b6824618a4595d142954
2020-05-23 22:03:23 +02:00
Jan Kundrát
6fa3ef8df1 params: Remove extra if
`kwargs` is always a dict, so it's still safe to probe individual
elements in there.

Change-Id: I2dea3159764807a2ff6a96d613ff69c68a7ba3ce
2020-05-23 22:03:23 +02:00
Jan Kundrát
1b2b048b47 reorganization: move request processing and request disjunction handling into topology.requests
Change-Id: I14902a2e15cc5fd27530cb294f4f549f6974fd49
2020-05-23 22:03:23 +02:00
Jan Kundrát
94c5281260 flake8: Remove unused variables
Change-Id: I67ff11f2580690f58dbe9909a92f8725a65bd7c7
2020-05-23 21:29:59 +02:00
Jan Kundrát
7da4ec08d8 reorganization: move JSON-specific bits from path_requests_run to tools.json_io
Change-Id: I03cf4e298300387ee5d56251e2e552e111b59f65
2020-05-23 21:28:06 +02:00
Jan Kundrát
2b473d26d3 reorganization: move example plotting into an extra submodule
Change-Id: Ibb7a81f360493ad8c1b7e58303e2e256c64dd755
2020-05-23 20:44:51 +02:00
Jan Kundrát
9e74e8b0a0 De-JSONify gnpy.core
Change-Id: I2657f3174209bef5912c7d0809fee876830ad11c
2020-05-23 20:44:51 +02:00
Jan Kundrát
648039521e reorganization: move all JSON processing into an extra module
We agreed that `gnpy.core` should only contain stuff for propagating
wavelengths. Conceptually, JSON parsing and even instantiating these
network elements from data obtained through JSON is *not* something that
is on the same level -- and this will become more important when we move
into YANG format in future.

Also, instead of the former `gnpy.core.equipment.common`, use
`gnpy.tools.json_io._JsonThing`. It is not really an awesome name :),
but I think it sucks less than a thing called "common" which would
no longer really be "common" in that new file.

Change-Id: Ifd85ea4423d418c14c8fae3d5054c5cb5638d283
2020-05-23 20:44:47 +02:00
Jan Kundrát
24bc023a07 refactoring: reduce amount of Python magic in network building
Similarly to I7ab9f64d7ac2042e8a16d031ba5562a6eb412471, it's better to
be explicit about what's going on. It makes it easier to reason about
the code.

Change-Id: Ic7f4a590567f1f5903222be8dae53521424d8f77
2020-05-23 18:14:41 +02:00
Jan Kundrát
19c2ae7f7a Qualify all usages of classes from gnpy.core.elements
This will be needed by one of the follow-up commits which move JSON
manipulation into a single file. We have to distinguish between "Roadm"
the JSON representation and "Roadm" the network element which propagates
spectrum.

Change-Id: I6888842c57c3a57849fabe75d0ff6f5bbfab426a
2020-05-23 18:11:08 +02:00
Jan Kundrát
9f6894a176 Remove extra indirection -- just instantiate Edfa directly
Change-Id: Ib957063d3333597fd58ba97bb78a187559f42d86
2020-05-23 17:52:39 +02:00
Jan Kundrát
a094568d6e equipment: move NF estimation into science_utils
I envision equipment.py as something which deals exclusively with
GNPy's traditional JSON-formatted data, so make sure we do not include
any computation in that file.

Change-Id: I8473cccd84243147181a7195ba39fc6c9db3c42f
2020-05-23 16:37:56 +02:00
Jan Kundrát
f728d96d07 equipment: mark internal functions as such
Change-Id: I3cbed25568c0e85685caf3e279f3c3a8e37247f1
2020-05-23 16:37:56 +02:00
Jan Kundrát
6b1fb7061f Move placeholder EDFA NF calculation to where it gets used
I think that equipment.py should be only concerned with constructing
network elements from JSON data.

Change-Id: I777835b02a23b76fb1d40c3a966e72b606e9c205
2020-05-23 16:19:05 +02:00
Jan Kundrát
7ef505f259 Do not perform magic when reading equipment config
I think that being explicit about what gets instantiated is much easier
to read. For one, it enables "find usages of this thing" in IDEs.

The original code was also insecure because it would happily invoke any
function available in the global context with user-supplied data(!).

Change-Id: I7ab9f64d7ac2042e8a16d031ba5562a6eb412471
2020-05-23 16:07:48 +02:00
Jan Kundrát
c8d394348d Tweak error messages a bit
Change-Id: I49af7ea0d40a901cee271f21306288337f6b041f
2020-05-23 16:07:17 +02:00
Jan Kundrát
0daa1c3e8c Rely on pip for missing dependency detection
We have some docs on how to install this, and we rely on the
requirements.txt file for specifying what packages are required. There's
no point in arbitrarily handling the `xlrd` package in a different
manner.

Change-Id: I2355440ada7fd0cc4868aeff5a8956729655c96b
2020-05-23 15:45:10 +02:00
Jan Kundrát
60bafd114d examples: remove unused imports
Change-Id: I5c60243ecf7cdb56bf1060e2630e35ffb5f3548f
2020-05-23 15:33:18 +02:00
Jan Kundrát
11509f5686 requests: simplify ResultElement inheritance
This has been the only place where content from gnpy.tools.service_sheet
was being used outside of tests and examples. It looks like something
which is actually *not* used anywhere (the ResultElement instances are
only ever appended to a list which gets used as-is, so we do not need any
custom comparisons or hashing).

Change-Id: Ib6ddcf55779218d602620e77973d88ad62d0ec7b
2020-05-23 15:23:27 +02:00
Jan Kundrát
785c823fa2 coding style: separate property definitions from each other
While not strictly needed in Python, it's much clearer this way.

Change-Id: I5f8caccde6440cb4032aae5deae577d328834a75
2020-05-23 15:23:27 +02:00
Jan Kundrát
9faf6430a5 reorganization: gnpy/{core => tools}/service_sheet.py
Change-Id: I88559cc718536f222b8ea9829bcc72a425c062ca
2020-05-23 15:23:27 +02:00
Jan Kundrát
01c566a325 reorganization: gnpy/{core => topology}/spectrum_assignment.py
Change-Id: Ic6194ce639dcb2f9419372febe0f2b58473edb38
2020-05-23 15:05:42 +02:00
Jan Kundrát
baa9171315 docs: improve docs markup a bit
Change-Id: Ic1f60dbdfbe08111026cc3bdcd8d56293cb4555d
2020-05-23 14:59:35 +02:00
Jan Kundrát
2e50337f38 docs: remove implementation details from the documentation
Change-Id: Ic5ff20b16cc932953729f9cae2e8e3d96058e92d
2020-05-23 14:59:18 +02:00
Jan Kundrát
8daa298699 Import referenced exceptions
Change-Id: Iba0dc2682edc59e211bd8bbe638ce0bc1f4917fe
2020-05-23 14:58:49 +02:00
Jan Kundrát
76cdd5dc71 parameters: remove unused logger
Change-Id: I2d51081c8e1f7315861c2280ceb92304327c2ac6
2020-05-23 14:30:15 +02:00
Jan Kundrát
07eb2dd13a Refactoring: conversion functions instead of gnpy.core.units.UNITS
The TL;DR behind this patch is that it's better to have a utility
conversion function instead of having a multiplier LUT and open code which
implements the conversion.

The FiberParams handling looked fishy -- apparently, it was keeping the
multiplier around, but it was unconditionally setting the units to
meters, anyway. Given that the units were not being preserved anyway
(everything got converted to meters), and that the multiplier was not
used anywhere, let's refactor the code to just convert to meters using
our new utility function, and remove the unused argument.

Change-Id: Id886d409a4046f980eed569265baefd97db841bd
2020-05-23 13:50:25 +02:00
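A sketch of the "utility conversion function" idea, with an assumed name and unit table (the real helper may differ):

  def convert_length(value, units):
      """Return the length in meters, whatever unit it was given in."""
      multiplier = {'m': 1.0, 'km': 1e3}   # one explicit table, one call site
      return value * multiplier[units]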
Jan Kundrát
04e764d024 FiberParams: print out the configuration upon failure
Apparently it's sometimes not obvious where the input data come from
(see next commit), so let's show the data which caused this exception to
the user.

Change-Id: Id333903a0549c4ef5dc37c2f6ff340bd357279ea
2020-05-23 13:50:25 +02:00
Jan Kundrát
05ccb14e5d Remove unused gnpy.core.execute submodule
Change-Id: I3972f8321547bc596c018fa04232edfa23b97581
2020-05-23 13:30:19 +02:00
Jan Kundrát
f60d035e66 Ensure that automatic_fmax gets used and not reinvented
Change-Id: I1f77f47eec3a3f0e6f22a073a98edca20958d321
2020-05-23 13:13:22 +02:00
Jan Kundrát
0823f8de46 Move automatic_nch, automatic_fmax into utils
I think that gnpy/core/equipment.py should contain only stuff which
prepares the equipment_config, not anything "lower level" that is reused
from other places.

Change-Id: I0cd593fd3e5558178ddd0ad8fff5c596e022894a
2020-05-23 13:08:24 +02:00
Jan Kundrát
a453c57996 document and test automatic_nch and automatic_fmax
Change-Id: Ib6a5e5f8278456725c810a9f7f0c7f5cafd78bd2
2020-05-23 13:03:23 +02:00
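What such helpers typically compute, sketched under the assumption of a fixed channel grid; the exact GNPy signatures and off-by-one conventions are not shown in this log:

  def automatic_nch(f_min, f_max, spacing):
      # how many channels of the given spacing fit between f_min and f_max
      return int((f_max - f_min) // spacing)

  def automatic_fmax(f_min, spacing, nch):
      # the upper frequency reached by nch channels starting at f_min
      return f_min + spacing * nch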
Jan Kundrát
2f84bb5286 Merge "Zuul: test with Python 3.6 as well" into develop 2020-05-23 09:03:20 +00:00
Jan Kundrát
7b8e68aea9 Zuul: test with Python 3.6 as well
We're saying that Python 3.6 is the minimal version that we support, so
let's make sure there's CI coverage for that.

I also tried enabling Python 3.8, but somehow the build of Pandas
failed. I don't feel like debugging a Pandas build failure today, so
let's postpone this thing until later. Just having 3.6 is a net
improvement, and we can play with even newer Python later -- and perhaps
on a newer distro, anyway.

Change-Id: I28a8c282225b7070ed3dddba56cccc8def313a77
2020-05-23 10:24:33 +02:00
Jan Kundrát
8d553a255f Merge "docs: brand the generated docs with our shiny logo" into develop 2020-05-22 13:04:13 +00:00
Jan Kundrát
2766e37438 CI: do not clutter up Zuul's failure display with the coverage report
Just ask pytest-cov to not generate any coverage report. The coverage data
are still generated and will be used in later steps of the build
pipeline.

Change-Id: Ic5bb6a48e2abf6ee52e1c2650727ce4170611171
2020-05-21 12:52:19 +02:00
Jan Kundrát
f56e64410b docs: brand the generated docs with our shiny logo
Image source: I took the existing banner, cropped it and resized to
200px width as per the theme docs.

Change-Id: Ic79b6164d557298746fe878de31ee0a9b0d93923
2020-05-19 18:11:29 +02:00
Jan Kundrát
15ea7218e9 Merge gnpy.core.node.Node into gnpy.core.elements
That class is an internal implementation detail, so mark it with a
leading underscore as per Python idioms.

Also, tweak the docs so that there's less duplicate information and
more cross-references.

Change-Id: Ieb1c8034ab5b442032396d7c4bbd0a697c7eb492
2020-05-19 17:29:11 +02:00
Jan Kundrát
db28011c61 coding style: manual tweaks
This mainly reverts some auto-fix-ups done in
I2f0fca5aa1314f9bb546a3e6dc712a42580cd562 which do not make that much
sense. By reverting them by hand, it's (hopefully) easy to see what is
just tool work and what is an opinionated preference.

Change-Id: I6cb479e34b552fadc85c41b4b06b24e60c87b4a3
2020-05-19 13:59:56 +02:00
Jan Kundrát
faccc23018 Always use our ansi_escapes module
Change-Id: I27bac671caa7544f51e7935ee120c9fcd6c942a7
2020-05-19 13:59:56 +02:00
Jan Kundrát
49514c0c70 flake8: fix F401 (unused imports)
Change-Id: I6f79f3a4c071b332e45033f4189a2af6c66a6e05
2020-05-19 13:45:05 +02:00
Jan Kundrát
0b1557fdf1 flake8: fix F841 (local variable is assigned to but never used)
Change-Id: Ic79d0189bf4e97b953604edd0a6932f28c71a071
2020-05-19 13:30:04 +02:00
Jan Kundrát
46f89aa770 coding style: autopep8 in an aggressive mode (-aaaaaaaaaa)
I decided to skip the following chunk of the diff because I think that
it would actually make the code a bit harder to read:

diff --git gnpy/core/service_sheet.py gnpy/core/service_sheet.py
index 9965840..9834111 100644
--- gnpy/core/service_sheet.py
+++ gnpy/core/service_sheet.py
@@ -41,8 +41,22 @@ logger = getLogger(__name__)

 class Request(namedtuple('Request', 'request_id source destination trx_type mode \
     spacing power nb_channel disjoint_from nodes_list is_loose path_bandwidth')):
-    def __new__(cls, request_id, source, destination, trx_type,  mode=None, spacing=None, power=None, nb_channel=None, disjoint_from='',  nodes_list=None, is_loose='', path_bandwidth=None):
-        return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing, power, nb_channel, disjoint_from,  nodes_list, is_loose, path_bandwidth)
+    def __new__(
+            cls,
+            request_id,
+            source,
+            destination,
+            trx_type,
+            mode=None,
+            spacing=None,
+            power=None,
+            nb_channel=None,
+            disjoint_from='',
+            nodes_list=None,
+            is_loose='',
+            path_bandwidth=None):
+        return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing,
+                               power, nb_channel, disjoint_from, nodes_list, is_loose, path_bandwidth)

 # Type for output data:  // from dutc

diff --git tests/test_automaticmodefeature.py tests/test_automaticmodefeature.py
index 0e5f633..5ba5881 100644
--- tests/test_automaticmodefeature.py
+++ tests/test_automaticmodefeature.py
@@ -32,7 +32,26 @@ eqpt_library_name = Path(__file__).parent.parent / 'tests/data/eqpt_config.json'
 @pytest.mark.parametrize("net", [network_file_name])
 @pytest.mark.parametrize("eqpt", [eqpt_library_name])
 @pytest.mark.parametrize("serv", [service_file_name])
-@pytest.mark.parametrize("expected_mode", [['16QAM', 'PS_SP64_1', 'PS_SP64_1', 'PS_SP64_1', 'mode 2 - fake', 'mode 2', 'PS_SP64_1', 'mode 3', 'PS_SP64_1', 'PS_SP64_1', '16QAM', 'mode 1', 'PS_SP64_1', 'PS_SP64_1', 'mode 1', 'mode 2', 'mode 1', 'mode 2', 'nok']])
+@pytest.mark.parametrize("expected_mode",
+                         [['16QAM',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           'mode 2 - fake',
+                           'mode 2',
+                           'PS_SP64_1',
+                           'mode 3',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           '16QAM',
+                           'mode 1',
+                           'PS_SP64_1',
+                           'PS_SP64_1',
+                           'mode 1',
+                           'mode 2',
+                           'mode 1',
+                           'mode 2',
+                           'nok']])
 def test_automaticmodefeature(net, eqpt, serv, expected_mode):
     equipment = load_equipment(eqpt)
     network = load_network(net, equipment)

Change-Id: I522c45c079b3a9540568657e2ae0a4bfc5fb1272
2020-05-19 12:53:11 +02:00
Jan Kundrát
3548ed74e2 coding style: autopep8 --in-place --recursive --jobs 4 --max-line-length 120 gnpy/ tests/
Change-Id: I2f0fca5aa1314f9bb546a3e6dc712a42580cd562
2020-05-19 12:40:00 +02:00
Jan Kundrát
145653df6e reorganization: gnpy/{core => topology}/request.py
Change-Id: Ib399549479b56634c681930aa444b657e5f58ca7
2020-05-19 11:56:02 +02:00
Jan Kundrát
c7589e0bca linters: remove unused variables
Change-Id: I16cb9fab7996efcd01eec1f5dee7be12adae43c2
2020-05-19 11:55:26 +02:00
Jan Kundrát
531810cc85 Remove unused class import
Change-Id: I3e0b7174f6ff60a632bbcde56178499755b8b7df
2020-05-19 11:55:17 +02:00
Jan Kundrát
63a6256b5e pep8: rename classes to use CamelCase
Change-Id: Ia696e05b72f1bc5feb570996f492042dafab262d
2020-05-19 11:54:27 +02:00
Jan Kundrát
c3febb6db4 gnpy/core/request.py: autopep8
Change-Id: I8790f67e6d7993b50d6fc0295f3a8a88439ac9ae
2020-05-19 11:54:27 +02:00
Jan Kundrát
8b1d8b3479 reorganization: XLS conversion goes to gnpy.tools
Change-Id: Ibbaddacd24a2d0f6f6d98fdc30d57da3be188338
2020-05-19 11:54:23 +02:00
Jan Kundrát
3168603908 gnpy/core/convert.py: coding style fixes
Change-Id: I760ff33813e916b33cb7cb3f6cef470200587acf
2020-05-19 11:49:55 +02:00
Jan Kundrát
a2128227bd gnpy/core/convert.py: run autopep8
Change-Id: Ib1a70a4f7bf8ab36aa1ec9265e2afaf69ca59d94
2020-05-19 11:49:50 +02:00
Jan Kundrát
376826b3ae Merge "Remove debug dumping of "city mapping"" into develop 2020-05-19 09:32:43 +00:00
Jan Kundrát
4b258cdf2e Merge changes from topic "namespaces-and-modules" into develop
* changes:
  docs: emphasise the API reference over bits which are duplicated in the README
  docs: remove list of authors
2020-05-19 08:01:41 +00:00
Jonas Mårtensson
fbdd132a3d Support propagation of single channel
It always seemed like a strange restriction to not allow this.

Change-Id: Ice3ed3ecc08f42b6ef8b74d4a6bc3b1794ff078a
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-05-14 21:14:12 +02:00
Jan Kundrát
eebcebb33d Remove extra elements from default EDFA parameters
Since Ia8c8d734c7045062ce123360f4a1432490384118, the lists no longer
have to have the same length.

Change-Id: I43ff10defa414cde0d6ff26a4d238b8680e39899
2020-05-14 16:43:51 +02:00
Jonas Mårtensson
20152036ff Allow different lengths of EDFA config lists
Currently, interpolation of nf_ripple or gain_ripple fails if the length
of the input list is different from the dgt input list, since the amplifier
frequencies used for interpolation are calculated based on the length of
the dgt input list. There seems to be no good reason for this restriction,
so I propose to calculate amplifier frequencies separately for the
different inputs. This also allows specifying a flat ripple with just one
number and a flat dgt with two numbers.

Change-Id: Ia8c8d734c7045062ce123360f4a1432490384118
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-05-14 15:30:57 +02:00
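A sketch of the per-input interpolation proposed above: derive each list's frequency axis from its own length, then interpolate onto the amplifier's channel grid (names and signature are illustrative):

  import numpy as np

  def interpolate_ripple(ripple, f_min, f_max, channel_freqs):
      source_freqs = np.linspace(f_min, f_max, num=len(ripple))
      # a single-element list yields a flat ripple; two elements give a linear dgt
      return np.interp(channel_freqs, source_freqs, ripple)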
Jan Kundrát
2a477071a0 docs: emphasise the API reference over bits which are duplicated in the README
Also make sure that all modules are covered. It seems that there's no
automatic way for doing this, aargh. On the other hand, there's
apparently no need to repeat all the Sphinx markup blurb, and even
sectioning works nicely (read, "as I expect it to work") now :). I think
that it's still necessary to keep these "intermediate files" that only
trigger package-level and module-level autodocs, but hey, I can live
with this.

Change-Id: I705e0054cd7cd4350171770795d69f5c15c226d6
2020-05-07 19:54:55 +02:00
Jan Kundrát
f02d11e8bc docs: remove list of authors
The documentation is something which our users see as one of the first
things when they go and try GNPy. With all respect to people who have
contributed over the years, there is more important information to
convey to a first-time user than a list of authors.

Change-Id: Ibe5f6477f9736b9ab71effcf0eccef7c7fdfdde5
2020-05-07 19:05:50 +02:00
Jan Kundrát
0d542f22a7 Remove debug dumping of "city mapping"
This JSON file is never used by the rest of the code. Let's get rid of
one of these files which are put into the source code directory during
unit test execution.

Also remove other dead code; thanks to Esther for catching this.

Change-Id: I30a4e7edcf638162ec438fbf7f00d26d78944ac3
2020-05-05 19:20:00 +02:00
Jan Kundrát
5af195bd2b Remove unused imports
Change-Id: I66174048a9eaab0f79ba4c3b1d31ef4dc9c2009b
2020-04-30 17:30:55 +02:00
Jan Kundrát
7ab93e7cd9 tests: frequency to wavelength
I decided to keep it around because I know that some people would like
to see those nanometers. Let's make sure it works.

Change-Id: Ib279cc8380a77f478da7a2bbc1e045a718446404
2020-04-30 17:30:55 +02:00
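The conversion being tested is the one-liner wavelength = c / frequency; a self-contained sketch (function name assumed), where 193.1 THz comes out as roughly 1552.5 nm:

  from scipy.constants import c

  def freq2wavelength(frequency):
      """frequency in Hz -> wavelength in m."""
      return c / frequency

  assert abs(freq2wavelength(193.1e12) - 1552.52e-9) < 1e-11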
Jan Kundrát
0aec47ddeb tests: remove dead code
Change-Id: Id8bc0cae7d91b2a80032890b2e30c70be566d052
2020-04-30 17:30:55 +02:00
Jan Kundrát
9a54dbab43 Remove unused functions
Given that everything else just uses these constants as imported from
numpy, there's no point keeping these wrappers around.

Change-Id: I0e19e05f40dc79d8005e915cf3ffb5e36328421a
2020-04-30 17:30:55 +02:00
Jan Kundrát
fc03be8bbe tests: use doctest instead of an explicit runner
...and expand the coverage a wee bit while we're at it as well.

Change-Id: I0de8445dc29f46e5f238bff0ca0e1f63fe19712d
2020-04-30 17:30:55 +02:00
Jan Kundrát
32f10a4507 tests: show how much *test* code is actually executed
Change-Id: Iba05c8b212b7217fe3d45a50f3d24f017ab09a74
2020-04-30 17:30:55 +02:00
Jan Kundrát
04544d41f6 Merge "Fix #353 - String representation of network elements" into develop 2020-04-30 15:29:07 +00:00
Jonas Mårtensson
c87be89e07 Fix #353 - String representation of network elements
Currently the string representation of some elements in elements.py refers to parameters that are not assigned until the propagate function runs.
Printing an element or trying to access its string representation before propagation therefore raises a TypeError.

This patch should fix the issue.

Change-Id: I29962f3c00e1f4fb7935535d4514a9579bc0c918
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-04-29 10:29:44 +02:00
Jan Kundrát
efa05f0653 Merge "Find paths with include nodes constraint" into develop 2020-04-28 16:42:10 +00:00
Jan Kundrát
4cf4db9bc2 Merge changes If9ebdfad,I9bc7e5bb,Ia7cf2026 into develop
* changes:
  include all 'no' Capital/non capital combination in test
  small refactor
  Enable ILA and FUSED type in routing constraints with xls input
2020-04-27 11:10:24 +00:00
Jonas Mårtensson
16434c5737 Find paths with include nodes constraint
The compute_constrained_path function in request.py handles include nodes constraint by first finding all simple paths shorter
than a cutoff length and then checking if any of the paths meet the constraint. The cutoff is currently set to 120 which is
too small for large networks. For the CORONET_CONUS_Topology.json topology, for example, the shortest path between some node
pairs is longer than the cutoff.

This is a small proposed change to first compute the length of the shortest path and then set the cutoff to at least 20% (a
conservative number) higher than this. I also propose to increase the minimum cutoff from 120 to 150.

Change-Id: I97ff2915fb38e4681bd64d60f2cd6dfd422afd4f
Signed-off-by: Jonas Mårtensson <jonas.martensson@ri.se>
2020-04-24 16:22:20 +02:00
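The proposed rule, sketched with networkx; the 20% slack and the floor of 150 come straight from the message above:

  import networkx as nx

  def path_cutoff(graph, source, target):
      shortest = nx.shortest_path_length(graph, source, target)
      return max(150, int(shortest * 1.2))   # at least 20% above the shortest path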
EstherLerouzic
14ee9c9a91 Removing exit(1) and using exception instead
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I13bb8c87a7f764bda6e13f233c7b70909b9ee035
2020-04-24 10:41:17 +00:00
Jan Kundrát
7cfc4bd1ec linters: activate more checkers
"naming" with the "N" prefix looks reasonable, and it looks like my venv
already has mccabe for that cyclomatic complexity check. Let's get these
in.

Change-Id: Ie0cf4d8c20486f0f561db01903b3aa93cbebdb12
2020-04-24 12:22:32 +02:00
Jan Kundrát
9d55cde50d linters: Increase maximum line length
The code is already using long lines, so I think we can avoid a
discussion whether to raise this limit, and instead we can focus on
another discussion :) on what that limit should be.

Random googling suggests that GitHub's code renderer operates at 119
chars per line. A random style guide mentions going to 100 or 120. Prior
to this change, there were 712 violations of the PEP8 check E501.
Picking 119 would bring us down to 21 violations, and using 120 reduces
it further to 16. I think that 120 is a cuter number than 119, so 120 it
is unless someone objects hard enough to propose another cutoff in a
follow-up patch.

Change-Id: I57f48fcb6846fea35223daac91aa2e8c7afabc63
2020-04-24 12:21:18 +02:00
Jan Kundrát
72183c24da Merge changes from topics "coverage", "examples-via-tests" into develop
* changes:
  CI: run flake8 for coding style nagging in changes
  Fix syntax of the bandit config file
  Upgrade pytest, and list it as a test-only dependency
  CI: Activate coverage diffing
  tests: simplify tox setup
  CI: Run both non-coverage and coverage build/test jobs
  tests: Check if example code provides exact same output
  tests: Run examples via pytest
  Show more details upon a test failure
2020-04-24 09:32:03 +00:00
Jan Kundrát
4d1a628488 CI: run flake8 for coding style nagging in changes
This combines a few pieces of magic:

- Zuul merges speculatively, but in the git trees that the build VMs
see, the "current branch" is always the state as-if the currently tested
change was merged, while origin/"current branch" is the result of all
previous changes, if any, applied to the current tip of the target
branch.

- flake8 has a --diff option which appears to work correctly, *but* it
reports on all lines that are present in its input, including the
context -> hence the -U0 option.

Change-Id: I28403ff0132fac0b52696c82306732a4f81fa66a
See-also: https://gitlab.com/pycqa/flake8/issues/58
2020-04-24 00:14:02 +02:00
Jan Kundrát
fdeaf75361 Fix syntax of the bandit config file
Unlike codacy, it seems that the bandit CLI tool complains about these:

  Unable to parse config file: File contains no section headers.
  file: '/home/jkt/work/TIP/oopt-gnpy/.bandit', line: 1
  "skips: ['B101']\n"

Change-Id: Iab93052fd8aaf1754571a3c66796cfe3026f6a63
2020-04-23 18:38:36 +02:00
Esther Le Rouzic
5695fafac6 Merge "Adding a set of tests on spectrum_assignment functions" into develop 2020-04-23 13:17:31 +00:00
Jan Kundrát
0fe1d195a3 Merge "Update Excel userguide" into develop 2020-04-23 12:53:40 +00:00
EstherLerouzic
ed1aa0aa03 Update Excel userguide
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I9ef3d26c8363526d14fb00a17c31176925488abd
2020-04-23 14:45:39 +02:00
EstherLerouzic
3fc024f5ba include all 'no' capital/non-capital combinations in test
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: If9ebdfad62d10b63856652d094ef459d68973f7c
2020-04-23 12:43:11 +00:00
EstherLerouzic
4f8177908f small refactor
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I9bc7e5bb171864ee385fb2e59631943f2a65d235
2020-04-23 12:42:59 +00:00
EstherLerouzic
ad64595d75 Adding a set of tests on spectrum_assignment functions
- test oms building:
  o there should not be ROADMs or transceivers between the end points,
    except the endpoints themselves
  o the element's oms id must correspond to the correct oms id
  o the element uid in the oms must match the id list

- test the alignment function
  this function enables having different grid end points in different parts
  of the network. The grid used in the network must cover all these grids, so
  some bit stuffing is needed on the oms that have different sizes.
  The test checks that
    o min and max attributes are correctly updated
    o min and max n or freq values are consistent, and consistent with the bitmap
    o alignment is correct

- test that the assignment of n and m values is correct
  o check that assign_spectrum has returned an error code if the requested
    assignment is not possible and that the bitmap has not been set to 0
  o check that the bitmap sum works correctly when assignment is feasible
    and that the whole requested range of spectrum has been set to zero
    (see the sketch after this entry), e.g.:
    [1 1 0 0 0 0 1 1 1 1 1 1] and [1 1 1 1 1 0 0 0 0 1 1] must be:
    [1 1 0 0 0 0 0 0 0 0 1 1]

- Check that spectrum assignment of 13,7 is correct in Hz

    This example has been extracted from ITU-T G.694.1;
    the expected values in Hz for 13,7 are 193137500000000.0 and 193225000000000.0;
    see fig. I.3 of this document: https://www.itu.int/rec/T-REC-G.694.1-201202-I/fr

- test assignment limits

  o verify that inconsistent values raise an error, i.e. with defined fmin and
    fmax, n and m have limited values;
    combine valid and invalid data for n and m
  o verify that a Bitmap created with a 0/1 list is consistent with fmin,
    fmax, grid and guard band

- Test with a path configuration

  o loop on assignment on a given path; assignment should be OK until n = 96.
    At that value the assignment is no longer feasible (it exceeds the grid) and
    the selection function should return None for all of center_n, startn and stopn
  o select an arbitrary request and try to assign 1 slot more than the whole
    spectrum, or exactly the whole spectrum, and check that the function correctly
    returns None or not-None values

- Test assignment with reversed path

  o add data and requests fixtures to reduce test time
  o test that if spectrum is assigned in one direction it is also
    assigned in the reversed direction (bitmaps must be identical)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Co-authored-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
Change-Id: I96dd15452cc2e59086d4022ec4b026be845f4355
2020-04-23 14:36:09 +02:00
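The bitmap arithmetic those tests exercise is an elementwise AND: a slot stays free (1) only where both inputs are free. In the example quoted above, the second bitmap seems to be one element short; it is padded with a trailing 1 here, the only value for which the AND reproduces the stated result:

  import numpy as np

  b1 = np.array([1, 1, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])
  b2 = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 1])   # padded to 12 elements
  assert (b1 & b2).tolist() == [1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1]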
Jan Kundrát
80133e9521 Merge "Better explain the source of error in exception message" into develop 2020-04-23 12:22:05 +00:00
EstherLerouzic
de58d7d7c2 Enable ILA and FUSED type in routing constraints with xls input
- builds correspondence dicts between input names from excel
  and names created with convert.py and autodesign in network.py
- correct the corresp_name dicts according to the effective
  network autodesign. This supports the case of fiber splitting
  and of fused elements
- include the case of parallel links with only one hop
- interpret the node list constraint given by the user with the dict
- filter the constraints that are not applicable
- add tests for constraints
- correct equipment sheet of mesh_example_topologyv2.xls: morlaix and
  loudeac should not appear in node A column since they are fused

ILA and FUSED constraints must be filled with the next node
information in order to avoid confusion on the direction.
For example:
           a----b-----c
           |    |     |
           i    j     k
           |    |     |
           e----f-----g

a constraint 'j' given for service i to k leads to 2 possible directions:
i-a-b-j-f-g-k
i-e-f-j-b-c-k
the user must indicate the chosen direction. This ambiguity does not
exist with network input in json format (names are unique).

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Ia7cf2026e569c8b566459791fc89546b91fb117c
2020-04-23 11:15:37 +02:00
Jan Kundrát
491a05c5a7 Upgrade pytest, and list it as a test-only dependency
Change-Id: Id1cbfb88d3034fc01f302af53bbd689afe54ffa5
2020-04-22 14:53:48 +02:00
Jan Kundrát
1d791aa295 CI: Activate coverage diffing
Depends-on: https://review.gerrithub.io/c/Telecominfraproject/oopt-zuul-jobs/+/489964
Change-Id: Icce2c2b11f4253e2400dcfe0055373f9d6bc7d14
2020-04-22 13:18:46 +02:00
Jan Kundrát
58921bc346 tests: simplify tox setup
I was not happy with duplicating the "test command" section among
several tox test environments. I was not successful when using just
`.coveragerc` or via `addopts` in `[tool.pytest]`, but these
conditionals appear to work.

Also replace `pip install -e .` or `python setup.py develop` with tox'
native `usedevelop`.

Change-Id: I4986a3d14f38b7f6ed9b6a04f773eb222a53a827
2020-04-22 13:18:46 +02:00
Jan Kundrát
ab6a91692b CI: Run both non-coverage and coverage build/test jobs
Change-Id: Ia58112b745a0b34d2dcef782ad729f165629eb22
2020-04-22 13:18:46 +02:00
Jan Kundrát
3b45968799 tests: Check if example code provides exact same output
Change-Id: I5938f85337e4254092683dadc806a0a419cb2a04
2020-04-22 13:18:46 +02:00
Jan Kundrát
c7d69b9a99 tests: Run examples via pytest
This will make it simpler to update coverage info. The pytest-cov plugin
that we're already using apparently supports this behavior, nice.

Change-Id: Ieafc0da99a8c325f5f2286ed11e66069e244e43b
2020-04-22 13:18:46 +02:00
Jan Kundrát
1eeed78430 Show more details upon a test failure
With -vv, we can at least have access to the full output -- if not in
its prettiest form.

Change-Id: I950bb4c2f59a9f582e53e48c78458bb47ab7d832
2020-04-22 13:18:06 +02:00
Jan Kundrát
1cdafbae0f docs: explain what the roll_off parameter means
Co-authored-by: Alessio Ferrari <alessio.ferrari@polito.it>

Change-Id: I4930dc5fdd9d4f60cef753f2d2e9674dce8aeb52
2020-04-21 21:27:14 +02:00
EstherLerouzic
47b4f87bc5 Better explain the source of error in exception message
If the user forgets to fill in the path_bandwidth, the error message was
not explicit enough to help correct it.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I776e95b6f859c3b643786b6e802f3c0bcbc1de76
2020-04-06 11:15:36 +02:00
Jan Kundrát
e24b766cdc Merge "Add an explanation for delta_p in excel user guide" into develop 2020-04-02 12:02:05 +00:00
Jan Kundrát
51fb6bd68e Merge "Correct the unit of gamma in Readme file" into develop 2020-03-30 13:37:03 +00:00
EstherLerouzic
7e405a0514 Correct the unit of gamma in Readme file
The units can be found in several papers from the documentation, for
example: A. Carena, G. Bosco, V. Curri, P. Poggiolini, M. Tapia Taiba,
and F. Forghieri, "Statistical characterization of PM-QPSK signals after
propagation in uncompensated fiber links," in European Conference on
Optical Communications, 2010, 1–3, IEEE, 2010-09.
URL: http://ieeexplore.ieee.org/document/5621509/,
doi:10.1109/ECOC.2010.5621509.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: If70b57a28ee108c4b8dbe341075e12169efbe14c
2020-03-30 15:29:29 +02:00
EstherLerouzic
194a13e607 Add an explanation for delta_p in excel user guide
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: I0e3380abd494dd99a30bee00b55ec427d48c45ef
2020-03-30 15:22:43 +02:00
Jan Kundrát
856f07e707 Merge "Zuul CI: remove manual workaround for Python3 and TOX" into develop 2020-03-30 13:19:01 +00:00
Jan Kundrát
4582aed895 Merge "docs: Fix all sphinx warnings" into develop 2020-03-30 13:17:44 +00:00
Jan Kundrát
734b6dfef4 Zuul CI: remove manual workaround for Python3 and TOX
Now that https://review.opendev.org/712804 has been merged, we can just
use upstream jobs without that extra workaround.

Change-Id: Ia113caaea41bdd440dcb615c02ad95eba6da7543
2020-03-27 16:06:06 +01:00
Jan Kundrát
da6e4f33a4 Merge "Fix dependency installation with PBR" into develop 2020-03-27 15:04:10 +00:00
Jan Kundrát
5a1e3f30b3 docs: Fix all sphinx warnings
...and also enforce a warning-free build within the CI.

Change-Id: Ia406a0a1ca2e89ceaa0288ae82128fa9427fe066
2020-03-27 15:07:19 +01:00
Jan Kundrát
094af16792 CI: build documentation, too
I've bumped Sphinx because this might be needed for proper support of
pbr-provided version numbers. In the end I also had to re-run `pip
install -e .` locally, quite likely due to a different format of version
numbers now that pbr is being used, but let's make sure that we're using
a version of Sphinx that is now known to work.

Change-Id: I451b4b17bb8097eb7f2f0f0956cc1c956a531828
2020-03-27 12:46:53 +00:00
Jan Kundrát
9474ff17de Fix dependency installation with PBR
Ouch, this one hurts. It turns out that PBR-based setup does *not*
install all dependencies when running `python setup.py install`. On the
other hand, running any of:

- `pip install .`
- `pip install --editable .`
- `python setup.py develop`

...makes everything work. So, let's change the instructions and all the
build scripts (including the Docker file) to make sure this thing still
works. Sorry for noise.

This is a significant change, it means that people will have "in-place
installations" of GNPy. Changes to their git checkouts, if used, will
apply, etc. I think this is actually a good change.

fixes #287

TL;DR: package installation with Python is still a mess.

Change-Id: I422b889b599e0b7cae36f160d1548cef7fb50a4e
2020-03-27 13:41:55 +01:00
Jan Kundrát
c675f5bd38 docs: do not reference a non-existing directory
Change-Id: I197956d3b55f0ebcf20cee54d5f2887bbd821fa3
2020-03-25 19:34:42 +01:00
Jan Kundrát
2556658e68 CI: build and test this under Python 3.7 and Vexxhost's Zuul
Change-Id: Ifa849ab8182596dc0285c18365ead806ea9d0bc9
Co-authored-by: Mohammed Naser <mnaser@vexxhost.com>
2020-03-25 19:34:42 +01:00
Jan Kundrát
4d5d10935a Convert to pbr setup
It turns out that our current setup does not really support the `sdist`
Python packaging step. I'm trying to increase automation, both for
making releases for upload to pypi.org, and also for CI via Zuul. As it
turns out, Zuul comes with a set of predefined jobs which -- by default
-- use `tox`, and tox was having troubles dealing with our `setup.py`
because it also assumes that the `sdist` step works. It is also supposed
to:

- fix version number duplication in `setup.py`,
- fix version number duplication in Sphinx docs,
- prevent the need to write a `MANIFEST.in` manually.

TL;DR: instead of having to deal with a ton of other creative
boilerplate, use tools for lazy people like me, and PBR ("Python Build
Reasonableness") appears to check all the marks here.

Change-Id: I27c36c4f6b0e76857d16f7819ec237e9b811476a
2020-03-25 19:34:42 +01:00
Jan Kundrát
eb87e36781 Configuration for git-review
We're still primarily a GitHub project, so it is necessary to add a
pointer for git-review to point to GerritHub (which is our Gerrit
hosting site).

See-also: https://docs.openstack.org/infra/git-review/
Change-Id: I6fe8da2358ae50340fbe384b038ab097ab229b59
2020-03-25 12:50:23 +01:00
Jan Kundrát
14d8d793f3 Fix test failure due to mishandled merge
In commit 80eced8, the structure of parameters to `elements.Fiber` was
changed. Options such as fiber length are now passed in via
`self.parameters.*` instead of `self.*`. Commit 639b379 which fixed a
test failure precedes that change, and when we merge both as I did in
commit bc4b664, the test no longer works. My bad. On the other hand,
this will be caught by trunk gating which is something that Zuul can do,
and therefore something that we'll have in our upcoming CI, yay!

Fixes: bc4b664
Change-Id: Ifcd8f0bf01e9d91dbef3da1aa7f56f89132d6f48
2020-03-24 19:50:24 +01:00
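
Schematically, the attribute move behind this breakage looks as follows (a sketch; the accessor is illustrative, not the actual gnpy code):

    def fiber_length(fiber):
        # before commit 80eced8 this read fiber.length; options such as the
        # fiber length are now grouped under a dedicated parameters object
        return fiber.parameters.length
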
Jan Kundrát
bc4b6642a0 Merge pull request #340 from Orange-OpenSource/correction_test_amplifier
Correction test amplifier

Fixes #324, #321
2020-03-24 16:59:30 +01:00
Jan Kundrát
fe2b39ee3b Merge pull request #342 from jktjkt/pr-337-339
Cleaned-up PRs #337 and #339 which are related to parsing of `STRICT` path requirements in service requests.
2020-03-24 13:51:59 +01:00
EstherLerouzic
d56c91ca7f small linter fixes
(Jan: cherry-picked relevant parts of c6be21f36f)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Change-Id: Ib84f452a15a20a88ea186ceef64712eb967d46a7
2020-03-24 12:07:33 +01:00
EstherLerouzic
25819fadf5 refactor test and add test on constraints
(Jan: cherry-picked relevant changes from commit
44e8936f04d6eb334cea2e5643d8ddceb6a0a5a8)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
Signed-off-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
Change-Id: Ibeeaa42b35e23075618c8543a857f47fac48df27
2020-03-24 12:05:56 +01:00
EstherLerouzic
0cacb8851d Fix node_list and "STRICT" checking
The check on the node_list constraint has to be done on all elements except the last one,
and the last one is always a "STRICT" value. This means that if no constraints are given
this list is equal to ["STRICT"]. Previous code was using len("STRICT"), i.e. 6, instead of
len(["STRICT"]), i.e. 1.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-03-24 12:00:20 +01:00
Jan Kundrát
14f98836ed Merge pull request #341 from ojnas/import-xlsx
Enable import from .xlsx files
2020-03-23 11:17:34 +01:00
Jan Kundrát
82ce3384ba tests: Use XLSX for the CORONET topology
Now that XLSX is officially supported, it should be tested as well.

Change-Id: I9286d3cb56950d554582d2eaf8153da5ffdd771a
2020-03-21 13:19:24 +01:00
Jan Kundrát
1f1877f7a9 tests: do not hardcode file suffix lengths
This will become useful for XLS -> XLSX conversion.

Change-Id: I025e4c24d00526d3bb48c23dcbdc82a65be9a477
2020-03-21 13:19:24 +01:00
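
What dropping hardcoded suffix lengths amounts to, sketched with pathlib (illustrative, not the actual test code):

    from pathlib import Path

    source = Path('CORONET_Global_Topology.xlsx')
    # brittle: name[:-4] + '.json' assumes a 3-letter suffix and mangles '.xlsx'
    json_file = source.with_suffix('.json')  # works for .xls and .xlsx alike
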
Jonas Mårtensson
db26ce07db Enable import from .xlsx files
Fixes issue #336
2020-03-20 18:17:02 +01:00
Jan Kundrát
e08ae9c959 Merge PR #329 into develop
This is a continuation from #319. The singleton-ish interface has not
been changed yet.

Change-Id: I93c8f1145561184f6e91f7e8f7debd3b48936205
2020-03-19 17:15:14 +01:00
EstherLerouzic
2de1b5567a updating test_compare_nf_models to correctly compare variable gain with polynomial
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-02-24 09:53:08 +01:00
EstherLerouzic
639b379a5b Updating ase test to handle the variable gain setup
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-02-21 16:23:11 +01:00
Mohammed Naser
0465397b1d Add empty Zuul configuration
In order for Zuul to start self-testing, it must actually have
a project entry listed.  This is the initial commit.

Change-Id: I017cb036f3191e46446d82b96a6acd35f2adcd0e
2020-01-28 20:20:41 +01:00
Jan Kundrát
d3ec39d506 Bump version to 2.1
Change-Id: I9f27fc87c5ca43e473fe212d2ee3dad7b12d4061
2020-01-15 00:09:48 +01:00
Jan Kundrát
bfe68a5948 Merge branch 'develop'
Change-Id: If7860b243cb504613a7fffafad2d601510000af7
2020-01-15 00:09:18 +01:00
Esther Le Rouzic
2ea3363613 Merge pull request #331 from Orange-OpenSource/pr_test
update version setup.py
2020-01-14 13:02:32 +00:00
DELFOUR Emmanuelle TGI/DATA-IA
89cce6e6a3 update version setup.py 2020-01-14 10:22:21 +01:00
Jan Kundrát
0f10ac706c Merge pull request #267 from Orange-OpenSource/capacity_planning-part2
This brings in the concept of OMS (so far created implicitly based on the input topology), spectrum assignment, etc.
2020-01-13 09:30:45 +00:00
EstherLerouzic
66d26f0ffa add the missing else to handle non first_fit policy
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-01-10 18:46:24 +00:00
EstherLerouzic
3c96914482 replace todo with TODO
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-01-10 18:41:40 +00:00
EstherLerouzic
61b1e73362 remove unused testing piece of code
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2020-01-10 18:40:18 +00:00
Jan Kundrát
03435079cc refactoring: simplify code flow a bit
- no need to explicitly log exceptions that are about to be raised
- kill some extra commented-out prints

Change-Id: I73bae5a2456644c4d4ff45bd984d44c27bc22ec4
2020-01-10 18:17:44 +00:00
Jan Kundrát
6661907c1d fix typos
Change-Id: I0a4d2c14c5e873dd521736525bb9b10c9b70975b
2020-01-10 18:17:17 +00:00
Jan Kundrát
fe811f725c tests: use native pytest features for exception handling
Using `with pytest.raises` is better than open coding the equivalent
feature. Similarly, when a block is not expected to raise an exception,
let's just let it run outside of a `try` block and rely on the test
framework to report a possible failure when hitting an unhandled
exception.

Change-Id: Icb1bb83e649733b56fcdc9168cabf88c9cf8d478
2020-01-10 18:16:53 +00:00
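
A minimal sketch of the idiom being adopted (an illustrative test, not code from the repo):

    import pytest

    def test_invalid_input_raises():
        # the block fails unless the expected exception is actually raised
        with pytest.raises(ValueError):
            int('not a number')

    def test_valid_input():
        # no try block needed: an unhandled exception fails the test by itself
        assert int('42') == 42
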
AndreaDAmico
80eced85ec Refactoring with some incompatible changes
Please be advised that there were incompatible changes in the Raman
options, including a `s/phase_shift_tollerance/phase_shift_tolerance/`.

Signed-off-by: AndreaDAmico <andrea.damico@polito.it>
Co-authored-by: EstherLerouzic <esther.lerouzic@orange.com>
Co-authored-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
2019-12-17 11:51:09 +01:00
AndreaDAmico
2960d307fa Unrelated changes
linter changes

Signed-off-by: AndreaDAmico <andrea.damico@polito.it>
Co-authored-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-12-17 11:13:00 +01:00
AndreaDAmico
c41cddfff5 Introduce new parameter structure for all Fibers
The Polito team wanted to have a single object with all parameters
required for each `Fiber` instance -- apparently for better code
readability.

Signed-off-by: AndreaDAmico <andrea.damico@polito.it>
Co-authored-by: Alessio Ferrari <alessio.ferrari@polito.it>
Co-authored-by: EstherLerouzic <esther.lerouzic@orange.com>
Signed-off-by: Jan Kundrát <jan.kundrat@telecominfraproject.com>
2019-12-17 11:12:52 +01:00
Jan Kundrát
8598e6591f Merge pull request #304 from Orange-OpenSource/user_error_catching_improvment
Give more useful comment for user to correct topology
2019-12-12 16:11:54 +00:00
EstherLerouzic
5e2259062c Give more useful comment for user to correct topology
when a node name used in a link does not correspond to the listed
node names, a "Bad link" msg was thrown without info on which link
causes the problem.
This change gives additional info on the link and catches the user error
more cleanly.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-12-12 15:55:33 +00:00
Alessio Ferrari
d483802a86 End-to-end test for Raman amplification 2019-12-12 12:52:59 +00:00
Jan Kundrát
9a0eece69c Merge pull request #322 from jktjkt/master
docs: Link to the updated meeting event
2019-12-09 20:46:06 +00:00
Jan Kundrát
1657bfd05f build: workaround Sphinx doc build failure
Upstream bugreport at sphinx-doc/sphinx#6887 suggests that this is due
to pip picking up the 0.16b0.dev0 pre-release...

Bug: https://github.com/sphinx-doc/sphinx/issues/6887
2019-12-09 16:05:56 +01:00
Jan Kundrát
49bf558916 docs: Link to the updated meeting event 2019-12-09 14:31:59 +01:00
Jan Kundrát
99f44a597b Merge remote-tracking branch 'origin/develop' 2019-11-13 19:59:35 +01:00
Jan Kundrát
a21f3fe6ee Merge pull request #312 from jktjkt/fixes
Small refactoring for itufs/utifl
2019-10-16 05:38:37 +00:00
Jan Kundrát
0ccbb2960c Merge pull request #277 from Orange-OpenSource/capacity_planning-part1
Capacity planning part1
2019-10-16 05:36:56 +00:00
EstherLerouzic
c577a75725 second set of modifications due to codacy report
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
8827e0cf6f Codacy report: minor changes (trailing spaces, ...)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
b0012fe399 SA: process also the case when pth is empty
TODO: a path containing only transceivers and no ROADMs leads to no OMS;
this has not been properly taken into account. (A single link w/o ROADM
has no SA complexity and is out of scope.)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
31e634615b Complete test on response comparison
If the request is bidir and 'z-a-path-metric' is missing, an error is raised;
if it is present, no error should be raised.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
8300a55e39 Update test file with correct SA (due to bidir correction)
reversed path must be taken into account for spectrum assignment

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
5b939bc57a Improve compare_response function
remove some prints, indicate whether the printed key comes from the
actual or the expected response,
and make precise that the test is also performed on values.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
2f1ab9cc50 Create a SpectrumError(Exception)
Create an exception in case of an error due to a spectrum problem,
e.g. if the spectrum request is not correct
or if values are not correct.
Use it instead of exit().

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
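
The shape of such an exception, as a sketch (the check_slots helper is illustrative):

    class SpectrumError(Exception):
        """Raised when a spectrum request or its values are not correct."""

    def check_slots(m):
        # raise instead of calling exit(), so callers can catch and report
        if m <= 0:
            raise SpectrumError(f'invalid number of slots M={m}')
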
EstherLerouzic
42ba3eb98d Add a test to go through the new added code in spectrum_assignment
Verify that no error is raised if M=0 and the blocking attribute is present

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
9eb87fc8e1 raise error instead of exit in assignment
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
8fab9bb945 Raise exceptions instead of exit()
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
1ead232a78 Do not block computation if M=0
When a request is blocked, its M value is set to 0. This should not raise
an error in pth_assign_spectrum. This test verifies that the function
does not raise an error.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
b15c8c60ab Add a DisjunctionError exception and use it
in case there is no possible disjoint path with the added constraints
(most of the time due to an inconsistent user request)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
66bdeb0e4d Catch Service error exception in the main program
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
1a2e090104 Move all main program into a main function in path_requests_run.py
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
a8e280e29b Remove exit and use ServiceError exception instead
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:19 +01:00
EstherLerouzic
edb54b02ac Another time improving formatting (codacy report)
removing unused import
change short variable names to conform to [a-z_][a-z0-9_]{2,30}$
change main variable names to conform to [A-Z_][A-Z0-9_]{2,30}$
add or remove spaces
add docstrings
correct comments and indents of cut lines

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-10 09:02:02 +01:00
EstherLerouzic
83d3f32fe0 Precise type of exception (codacy report)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
085a379592 improving formatting (codacy report) of path_requests_run, request and service_sheet
remove unused imports
add docstrings
conform to '[A-Z_][A-Z0-9_]{2,30}$' pattern in main
remove trailing spaces
add/remove extra spaces

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
37bd5d0404 Code coverage improvement test service creation w/o sync vector
add a one-service-only excel file + changes to compare.py to support
the no-synchronization-vector case

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
f788b81d21 Code coverage improvement test json generation with Z to A direction
add a bidir request to test service error handling
TODO: write a more detailed test on the bidir case

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
2ff1ce6b34 Code coverage improvement test ServiceError handling
For this purpose we create a wrong request with M=0 and verify
that serviceError is correctly raised when calling Result_element class

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
41a1e40d14 Update test data files
- add the bidir information on json expected services,
- add the label object in json responses
- add the spectrum on csv responses
- add the no-path container when relevant

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
921e8d2d3c Call build_oms_list in test_disjunction
Call build_oms_list in the test to correctly build all attributes
of network elements

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
c009d28f7d Update test_disjunction and test_automaticmodefeature
Add bidir argument on load_requests calls

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
898eada097 Update test_parser
function compute_path_with_disjunction needs an additional reversed path
and results must contain wavelength assignment

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
bdfc55e801 Add a ServiceError(Exception) for malformed user requests
For example requested bandwidth should be >0

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
57f264bedb Change name of br and pw variables to brate and pwr
To conform to [a-z_][a-z0-9_]{2,30}$

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
fbe4fa3cf0 add docstring to functions
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
b2ef345f35 Cut long lines over 100 char
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:17:35 +01:00
EstherLerouzic
471ea7dfba Correct indentation due to linter report
using pylint3, which
recommends having cut lines with the same indent as the starting [ or (

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
1b52f638ff Another time: Correct indentation due to linter report
using pylint3, which
recommends having cut lines with the same indent as the starting [ or (

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
84ab38a75f Another time: removing extra spaces, trailing spaces and adding missing spaces
for codacy report:
     removing extra spaces before , and :
     adding spaces after , :
     adding spaces around < > =

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
916e5377f8 change variables d, el to conform to '[a-z_][a-z0-9_]{2,30}$' pattern
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
534bfd881e correction of the assignment: SA must be bidir
reversed path must be computed even if bidir is not requested
because in WDM systems services are all bidir.
The bidir option is only there to avoid the lengthy propagation of the
reversed path when it is not needed

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
7c4015324d Moving reversed path computation and propagation to compute_path_with_disjunction function
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8499ee52f4 Removing extra spaces, trailing spaces and adding missing spaces
for codacy report:
 removing extra spaces before , and :
 adding spaces after , :
 adding spaces around < > =

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
cc1123863c Cut long lines to 100char max due to linter report
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
ca382806f6 Reordering imports due to codacy report
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
3559fc61c2 Introduce reversed path on CSV output
- use a new jsontopath_metric function to collect metrics
  out of a json response.
- path metrics must be shown also for reversed path if this
  is requested. This function avoids repeating code here
- add reversed metrics on the last columns of the CSV

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
33581cdcc9 Add bidir as a parameter in path_requests_run.py
if --bidir option is used on the main program, the field is set
  to true for all demands in case demands are expressed in an excel
  sheet. --bidir option does not change bidir field if the service
  file is in json format.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
991eb02964 Add the bidir parameter in transmission_main_example.py
Update the program because the same Path_request class is used,
which now includes this bidir field

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
286e321a2d Add "bidirectional" field in the json service file example
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
56f158113d Implementation of bidirectional field per request
- Instead of applying the bidir option independently from service demands (json or xls)
  the "bidirectional" attribute is introduced per request in the json.
  This enables bidirectional option per requests.
  if --bidir option is used on the main program, the field is set
  to true for all demands in case demands are expressed in an excel
  sheet. --bidir option does not change bidir field if the service
  file is in json format.
  Default value of "bidirectional" attribute is False.
- As a result the reversed path is propagated only if the bidirectional
  field of the request is True. (remember that the reversed path must
  be computed whatever the option because it is needed to compute
  spectral occupation on both directions).

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
024f6ff963 Simplification of stdout
all BLOCKING_NOMODE requests have the same type of response
so the test is simplified to account for this

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8118a0f4f4 Second minor cosmetic changes
- add some help input for the main program arguments
- move and correct comments
- add empty lines on stdout to have a nice printing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
eb89d8fd86 Minor cosmetic changes
correct typos in comments and make precise that results show the
mean value of SNR of all channels on stdout

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
a938c1738b add reversed path information on stdout
the information on reversed path snr is shown in parentheses

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
4f88882513 Use reversed path as a constraint for spectrum assignment
in path assignment function, path elements and reversed path
elements are concatenated to compute the overall spectrum
availability on all elements

in main program, assignment is performed after computing reversed paths

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
4e8d8b7ddd adding reversed path in the main program
add a bidir argument to make bidir propagation optional.
Reversed path computation is not optional because it is needed
for spectrum assignment.
for all requests, if a path could be computed, a reversed path is
computed and propagation is performed on it if bidir option is on.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
afb7d75749 Add the reversed path as an attribute of Result_element
the object also contains the information of the reversed path if
the path is not None

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
488d0e1fe8 Change find_reversed_path function using the reversed_oms
reversed_oms was introduced in order to identify which oms corresponds
to the reversed direction of a given oms.
With this commit we make use of this functionality and avoid the cumbersome
way that recovered the reversed path by computing the shortest path in the
reversed direction for each roadm. That function could not support multiple links
in parallel between two roadms, and was adding computation time.

The function first lists the oms of the path in the reversed order and then
appends all their elements to the path. The first and last elements are transponders
and are appended separately.

Unidirectional topologies are not supported: if there is no reversed path, this raises an error

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
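
A sketch of the approach described above (path_oms and the attribute names are illustrative, not the actual gnpy code):

    def find_reversed_path(path, path_oms):
        """Rebuild the reverse direction of path from per-OMS bookkeeping."""
        reversed_path = [path[-1]]        # destination transponder goes first
        for oms in reversed(path_oms):
            if oms.reversed_oms is None:
                # unidirectional topologies are not supported
                raise ValueError(f'no reversed OMS for oms {oms.oms_id}')
            reversed_path.extend(oms.reversed_oms.elements)
        reversed_path.append(path[0])     # source transponder goes last
        return reversed_path
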
EstherLerouzic
708442e4cd Introduce a reverse_oms function to identify bidir oms
reversed_oms is introduced in order to identify which oms corresponds
to the reversed direction of a given oms. Indeed, for spectrum assignment
it is mandatory to mark spectrum resource occupation in both directions
(requests are assumed bidirectional in WDM).

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
a2d905dfb1 Add N and M info in the stdout
the couple (N,M) is displayed only when a request is not blocked

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
d564fe3e2a Add N and M info in the CSV
The couple (N,M) is added in the last column only for non-blocked requests;
an empty string is used otherwise

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
ea21cce1c0 Add N and M in the json response
The label object is added in the response. It contains center index N value
and number of slots M, required by the request according to the computation.

The label object directly follows the hop attribute as detailed in
draft-ietf-teas-yang-path-computation.

If the path is not blocked this changes the index of the last hop information
(-3 instead of -2) and the index of the transponder for the first hop
(2 instead of 1)

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
aa9b4aefbe Add the N and M information in request when spectrum is assigned
N is the center frequency index (on the G.694.1 grid) and M the number of slots
required by the request according to the computation.
For convenience, we use N and M = 0 when requests are blocked. Note that N = 0 is
a valid index when M is not 0.

If the number of slots required by a request is not feasible, the request is marked
as blocked with 'NO_SPECTRUM' as blocking reason

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8655030e59 Add blocking reason on stdout
the previous way only checked path existence to know if the request was blocked
or not.
Now the printing checks if the blocking_reason attribute exists, and if so
adapts the printing accordingly. The reason for blocking is added to the
output.

if no path could be computed, snr, osnr and other metrics depending on the path
are replaced by empty strings;
else, the metrics corresponding to the computed path are shown

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8107ddeb79 Change the processing of blocking case in main program
In compute_path_with_disjunction, in case the user mode is not feasible,
return the blocking reason instead of an empty path.
If the user does not give the mode and the automatic selection does not
find any feasible mode, the function now checks for the presence of a
blocking reason instead of checking whether a mode exists.

If the blocking reason is among the BLOCKING_NOPATH reasons, then an empty
path is returned;
if the blocking reason is among BLOCKING_NOMODE, then a path could be computed
and the mode information corresponds to the last explored mode.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
76c8e55f06 CSV generation is adapted to include blocking info
creation of a function to avoid code duplication: json_param creates
the relevant parameters to show in the csv based on the json input

replace try/except by a test on keys:
the previous way tried to get pth_el['no-path'], and if the path was
not blocked this raised a key error. Now there is a simple check if
the key is present.

Besides, as the no-path has been changed to a 'no-path' container containing
a 'no-path' attribute with the blocking reason, the test is made on the
attribute, so on pth_el['no-path']['no-path'].

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
a7b1ab47d8 adapt response from the compute_constrained_path to return info in case of blocking
if the path is empty, a NO_PATH or NO_PATH_WITH_CONSTRAINT reason
is returned in the blocking_reason attribute

if no mode is feasible, the last explored mode is returned with the path (and
implicitly the last computed SNR). The baud_rate is derived from this last
mode

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
879f587ab9 Adapt json response to include information in case of blocking
if a path could be computed: gives details of the path and of the propagation.
  - the 'no-path' attribute is changed to a 'no-path' container that contains:
      o the 'response-id' attribute
      o a 'no-path' attribute with the blocking reason
      o if a path could be computed, the 'path-properties' of the path that
        was computed, with the metrics

Note that this proposal to add information for blocking in the json output (instead
of a bare NO PATH) corresponds to the way PCEP works in general, but is not yet
integrated in the draft-ietf-teas-yang-path-computation model. Returning the whole path
in case of blocking, in addition to the blocking reason, is a novelty from GNPy and was a
request from the users.
TODO: use the correct ietf model when ready

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
8af2d80219 Add reasons for blocking in the response
Before continuing on spectrum assignment we need to clearly identify blocking reasons:
the implicit previous way (a path is missing) is not satisfactory because in case of
spectrum blocking a path was possible. So we introduce a 'blocking_reason' attribute
on requests, which will only exist if the path is blocked during the process.
The 'blocking_reason' attribute reflects the blocking reason.
The commit defines blocking types and groups them depending on the existence of a path
or on feasibility:
    'NO_PATH': no path was computed,
    'NO_PATH_WITH_CONSTRAINT': no path was computed with this constraint
    'NO_FEASIBLE_BAUDRATE_WITH_SPACING': no path was computed due to the spacing constraint
    'NO_COMPUTED_SNR': the computed path could not give any SNR result
    'NO_FEASIBLE_MODE': the user let the program choose a mode and path was computed but
                        no mode was feasible for the set of constraints
    'MODE_NOT_FEASIBLE': the user imposed a mode, a path was computed but this mode
                         is not feasible
    'NO_SPECTRUM': a path, a mode were selected but there is not enough spectrum available on
                   this path

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
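
Grouped as constants, the two families described above look roughly like this (BLOCKING_NOPATH and BLOCKING_NOMODE are the names used elsewhere in this log; the exact grouping is a sketch):

    # no usable path could be computed at all for the request
    BLOCKING_NOPATH = ('NO_PATH', 'NO_PATH_WITH_CONSTRAINT',
                       'NO_FEASIBLE_BAUDRATE_WITH_SPACING', 'NO_COMPUTED_SNR')
    # a path was computed, but no transmission mode works on it
    BLOCKING_NOMODE = ('NO_FEASIBLE_MODE', 'MODE_NOT_FEASIBLE')
    # 'NO_SPECTRUM' arises later still: path and mode exist, spectrum does not

    def is_blocked(request):
        # the attribute only exists if the request was blocked during the process
        return hasattr(request, 'blocking_reason')
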
EstherLerouzic
315eea1f55 Calling assignment in main program path_requests_run.py
the function is called to assign spectrum to each request;
the result shows an additional column for the blocking reason

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
EstherLerouzic
b61e541e15 pth_assign_spectrum function assigns wl for all demands
this is a first function that assigns spectrum following the order
of requests, according to the selected mode and the number of channels computed
based on the requested path_bandwidth

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
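
The per-demand channel count boils down to a ceiling division (a sketch):

    from math import ceil

    def nb_channels(path_bandwidth, bit_rate):
        # e.g. 400e9 bit/s requested over 100e9 bit/s channels -> 4 channels
        return ceil(path_bandwidth / bit_rate)
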
EstherLerouzic
81f88e78c7 Creation of the OMS structure to record spectrum assignment
oms_list contains the list of OMS, and each OMS contains the (ordered) list
of uids of the elements it crosses.
Each element is updated with the oms_id to which it belongs.
Each oms contains a bitmap with frequency slots according to the
frequency min/max defined in eqpt_config.json (SI); in case oms
are defined elsewhere, the grids are aligned to ease computation.
The build_oms_list function builds the OMS list and sets the oms attributes
on all network elements.
The commit also contains basic functions to handle the spectrum bitmap and indexes

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:16:36 +01:00
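
A rough sketch of that structure (field and method names are illustrative, not the actual gnpy classes):

    from dataclasses import dataclass, field

    @dataclass
    class OMS:
        oms_id: int
        el_id_list: list = field(default_factory=list)  # ordered uids of crossed elements
        bitmap: list = field(default_factory=list)      # one entry per frequency slot

        def add_element(self, element):
            self.el_id_list.append(element.uid)
            element.oms_id = self.oms_id  # each element records the OMS it belongs to
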
EstherLerouzic
87cc3dac00 Corrections with respect to second review
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 19:09:39 +01:00
EstherLerouzic
89e28cc7be Correct bug in parser: csv header comparison
the previous comparison was done on the result of .sort(), i.e. None.
The list.sort() method modifies the list in-place and returns None.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
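
The pitfall in isolation (illustrative):

    expected = ['response-id', 'source', 'destination']
    header = ['destination', 'response-id', 'source']
    # BUG: list.sort() sorts in place and returns None, so a comparison
    # against header.sort() compares with None and never matches
    assert sorted(header) == sorted(expected)  # sorted() returns a new list
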
EstherLerouzic
2ba29a78c5 Bug: constraint not correctly interpreted
if the name of a constraint input by the user is not part of the network names
(typically in the case of excel input), then the program tries to find a
name that is close to the user name and that is in the network list of
names. This list must not include transponder names (because transponder
end points are already listed as constraints and transponders in the
middle of a path are not supported yet).
This will be improved in the PR "Ila names in constraints" #278
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
f990a6c1be Correct some remaining strict loose into STRICT LOOSE
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
424e5a4786 change variable n to nel to conform to '[a-z_][a-z0-9_]{2,30}$' pattern
from codacy report

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
6a7a04ebb1 syntax correction
reordering import calls according to pylint3 + extra line removed

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
0366fc2956 Adding a no-path case for test coverage
add a no-path case (request 6) in requests and expected responses.
A response is also generated if the path is not feasible: check
that it is correctly handled in csv and json responses

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
48b7d71f02 add a test on json response generation
create a json response based on a test file and compare it to the expected
response.
The comparison first checks that there is no missing or extra key
and then compares the values behind the keys

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
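
The key-then-value comparison, sketched on plain dicts (illustrative):

    def compare_response(expected, actual):
        missing = set(expected) - set(actual)
        extra = set(actual) - set(expected)
        # first make sure no key is missing or unexpected...
        assert not missing and not extra, f'missing: {missing}, extra: {extra}'
        # ...then compare the values behind each key
        for key in expected:
            assert actual[key] == expected[key], f'mismatch on {key}'
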
EstherLerouzic
715baf2a1c correct compare response header test to support any order
the previous test assumed the same order for the header fields.
Order the headers in the same way in order to support different
orderings

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
e55cea776e improve code quality
remove unused import, use correct indents, remove extra spaces
add function description

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
b388d143fd change assert to raise AssertionError
use
    if not condition:
        raise AssertionError()
instead of
    assert condition
according to codacy recommendation

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
c592c572d8 adding a test for the csv creation
using the pandas package to have an easy ordering of response columns:
- test that the generated header is as expected
- read the response. In order to support different orders wrt the
  response answer and field answer, the test function first orders lines
  according to the response index and then the columns (fields of the answer).
check that the answers are as expected

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
dfa0a26a28 changes to improve quality
minor name refactor
indent corrections
minor fixes for spacing

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
609cd94798 Remove @ in path-metric names and extend type to decimal64
the @ character is not correctly read by the OpenDaylight yang tools used
for the transportPCE project:
https://docs.opendaylight.org/en/stable-nitrogen/developer-guide/yang-tools.html#working-with-yang-model
Changed the names of path metrics from osnr@ to osnr-

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
022f743db1 Change the exception handling if sync vector is absent
the previous try section encompassed errors that should not be silently
ignored. The correction pointed out a defect in a test file.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
1957beb1b6 Remove the filtering on transponders for correcting explicit route
any type is supported, including transponders, in order to accept a well-formed
route list from the user (if the user enters 'trx Lannion_CAS' this should be
accepted even if it repeats the source name)

A later PR better handles route list names

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
9ca72d6105 Correct csv creation in case of no path
previously reported the requested bandwidth; now fill it with an
empty string, same as the other fields.

This will be updated in a later PR when different kinds of blocking
are supported

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
e8e126a6ce update existing test files and examples according to ietf yang model
- 'loose' and 'strict' changed to 'LOOSE' and 'STRICT'
- n, m changed to N, M
- unused objects removed
- ...
Also correct templates for service and response

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
7849782173 Change content of source and destination to transponder end-points
previously contained the user-given source name in the excel file,
which could differ from the effective transponder source/destination

now both source and src-tp-id contain the same info (gnpy does
not differentiate between nodes and ports)
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
149a0da8c9 update csv creation according to all model changes
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
1e7c70a59b remove the restriction on fiber for the constraint
the restriction was placed on fiber type because, when using excel input,
autodesign creates fiber names with a syntax not explicit to the user, so that
entering a node name as a constraint may end up in a wrong interpretation, e.g.
aa - bb -> creates names such as 'fiber aa to bb', 'fiber bb to aa'.
If the user constraint is bb, there is an ambiguity on the fiber direction.
However this is not a problem for users of the json format,
so I preferred to remove this restriction now.
A later PR solves this naming ambiguity

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
EstherLerouzic
c9d8282e7f enable power and nch to be optional in requests definition
since empty attributes have been removed from the requests with
previous changes, some additional tests are needed to continue supporting
optional power and nch (previous objects were created with null
or default values).
The user may enter requests without specifying these fields; in this
case the 'output-power' or 'max-nb-of-channel' objects do not exist.
The try/except form handles the corresponding exception, and
default values are kept otherwise.

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:42 +01:00
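
The try/except pattern referred to, sketched with illustrative defaults:

    DEFAULT_POWER = 1e-3  # illustrative default values, not the real ones
    DEFAULT_NCH = 96

    def read_optional(params):
        # the objects simply do not exist when the user left the fields out
        try:
            power = params['output-power']
        except KeyError:
            power = DEFAULT_POWER
        try:
            nch = params['max-nb-of-channel']
        except KeyError:
            nch = DEFAULT_NCH
        return power, nch
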
EstherLerouzic
e7084a2c29 correction to support empty vectors of constraints
change the naming according to the model update +
previous changes authorize an empty list of nodes;
this new test supports this case

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
d79d2e0724 change the way transponder and mode are returned in the response
the previous way used a wrong interpretation of hop-type: in the ietf model,
hop-type is reserved for the LOOSE or STRICT constraint description.
Instead, transponder info, following the usual ietf way, should be
included as a new path-route-object type. Such an object is not yet
defined in IETF, so this is a GNPy proposal to use a 'transponder' object
with type and mode attributes.
This is what has been encoded in request.py for requests and for answers

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
402155c225 change the 'no path' case answer
in case no path could be computed, the answer is changed according to
ietf path-computation model to a simple 'no-path' object

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
e5ec669419 applying changes in the request dict creation
+ removing the (currently) unused direction attribute.
This will be added again in a later PR when it is used

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
8f424e8c9d correction of json generation: removing unused objects
- removing empty objects (if there is no content, remove the object),
  especially:
  - removing the empty synchronization vector if no disjunction
    constraint exists
  - removing the optional power and nb-of-channel fields, if the user
    did not specify any
when reading, first test if the object exists: try:/except to catch
this case
Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
EstherLerouzic
fea2b84bb9 correction of the json generation for path computation
- changes concerning names due to ietf path-computation draft changes:
    - 'unnumbered-hop' changed to 'num-unnum-hop'
    - suppression of 'label-hop'
    - 'link-diverse' and 'node-diverse' changed to 'disjointness'
- correction because of a previously wrong interpretation of the model:
    - the detailed succession of include nodes moved from
      'optimisation/explicit-route-include-objects'
      to 'route-object-include-exclude'
      in the 'explicit-route-objects' attribute
- correction of keywords:
    - n and m replaced by N and M
    - strict and loose replaced by the STRICT and LOOSE strings
    - change the name of the response from 'path' to 'response'
The example service file is corrected accordingly

Signed-off-by: EstherLerouzic <esther.lerouzic@orange.com>
2019-10-09 15:38:41 +01:00
Jan Kundrát
0c918940c4 Merge pull request #313 from jktjkt/codacy-assert
codacy: do not complain about asserts in the test suite
2019-10-06 21:49:22 +00:00
Jan Kundrát
a63a6ac0ec codacy: do not complain about asserts in the test suite
Writing an explicit comparison followed by argument printing *and*
handling the final error is just backwards. Let the tool be quiet, this
nonsense does not really add any value.
2019-10-06 23:12:09 +02:00
Jan Kundrát
9f58b914d2 Merge pull request #308 from jktjkt/osnr-bw-printing
Rework OSNR printing
2019-10-06 19:17:46 +00:00
Jan Kundrát
029bac4b03 utils: more descriptive name for itufl
This might have nothing to do with the ITU frequency grid (it's really
just about a uniform distribution), so let's give it a more readable
name and more readable parameters.
2019-10-06 20:43:50 +02:00
Jan Kundrát
a27ad57220 Do not use confusing default values
This looks like the root cause of bug #243, the default values suggested
that this function works in THz.

These defaults are not used anywhere, so let's get rid of something
which adds no value and actively confuses people.
2019-10-06 20:38:31 +02:00
Jan Kundrát
8d31d924f2 Remove unused function 2019-10-06 20:37:24 +02:00
Jan Kundrát
8c3b514f90 Only print out "pretty summary" when not doing the power sweep 2019-10-01 15:24:40 +02:00
Jan Kundrát
3df27fe315 Print both ASE-OSNR and final GSNR for both 0.1nm and signal-bw in power sweep
As agreed upon during today's coders call, this is something that is
needed by actual users in the field, so let's prioritize their needs
over clarity of output for demo purposes.
2019-10-01 15:21:34 +02:00
Jan Kundrát
a6087ce354 Highlight OSNR @ 0.1nm
Esther and Jonas pointed out that it is much more common to work with
OSNR measurements per 0.1nm rather than per signal bandwidth. This is in
line with what OSAs usually report and what transponder datasheets
specify.

cc #307
2019-09-27 09:22:08 +02:00
Jan Kundrát
aae0382523 docs: show a DOI
As per Alessio's request, this adds a DOI and makes this software
citeable. DOIs are assigned automatically via Zenodo in a process that's
integrated with GitHub.
2019-09-23 18:11:44 +01:00
148 changed files with 19016 additions and 12615 deletions

.bandit

@@ -0,0 +1,2 @@
[bandit]
skips: B101


@@ -1,3 +1,3 @@
#!/bin/bash
cp -nr /oopt-gnpy/examples /shared
cp -nr /opt/application/oopt-gnpy/gnpy/example-data /shared
exec "$@"


@@ -15,7 +15,7 @@ if [[ $ALREADY_FOUND == 0 ]]; then
# shared directory setup: do not clobber the real data
mkdir trash
cd trash
docker run -it --rm --volume $(pwd):/shared ${IMAGE_NAME} ./transmission_main_example.py
docker run -it --rm --volume $(pwd):/shared ${IMAGE_NAME} gnpy-transmission-example
else
echo "Image ${IMAGE_NAME}:${IMAGE_TAG} already available, will just update the other tags"
fi

.dockerignore

@@ -0,0 +1 @@
venv/

.gitignore

@@ -3,6 +3,7 @@ __pycache__/
*.py[cod]
*$py.class
.ipynb_checkpoints
.idea
# C extensions
*.so
@@ -64,3 +65,5 @@ target/
# MacOS DS_store
.DS_Store
venv/

.gitreview

@@ -0,0 +1,5 @@
[gerrit]
host=review.gerrithub.io
project=Telecominfraproject/oopt-gnpy
defaultrebase=0
defaultbranch=develop


@@ -7,14 +7,11 @@ python:
- "3.7"
install: skip
script:
- python setup.py install
- python setup.py develop
- pip install pytest-cov rstcheck
- pytest --cov-report=xml --cov=gnpy
- rstcheck --ignore-roles cite --ignore-directives automodule --recursive --ignore-messages '(Duplicate explicit target name.*)' .
- ./examples/transmission_main_example.py
- ./examples/path_requests_run.py
- ./examples/transmission_main_example.py examples/raman_edfa_example_network.json --sim examples/sim_params.json --show-channels
- sphinx-build docs/ x-throwaway-location
- pytest --cov-report=xml --cov=gnpy -v
- rstcheck --ignore-roles cite *.rst
- sphinx-build -W --keep-going docs/ x-throwaway-location
after_success:
- bash <(curl -s https://codecov.io/bash)
jobs:

.zuul.yaml

@@ -0,0 +1,44 @@
---
- project:
check:
jobs:
- tox-py36-cover
- coverage-diff:
voting: false
dependencies:
- tox-py36-cover-previous
- tox-py36-cover
vars:
coverage_job_name_previous: tox-py36-cover-previous
coverage_job_name_current: tox-py36-cover
- tox-linters-diff:
voting: false
- tox-docs-el8
- tox-py36-cover-previous
gate:
jobs:
- tox-py36-el8
- tox-docs-el8
tag:
jobs:
- oopt-release-python:
secrets:
- secret: pypi-oopt-gnpy
name: pypi_info
pass-to-parent: true
- secret:
name: pypi-oopt-gnpy
data:
username: __token__
password: !encrypted/pkcs1-oaep
- Taod9JmSMtVAvC5ShSbB3UWuccktQvutdySrj0G7a1Nk4tKFQIdwDXEnBuLpHsZVvsU9Q
6uk4wRVQABDSdNNI/+M/1FwmZfoxuOXa02U5S1deuxW/rBHTxzYcuB8xriwhArBvTiDMk
zyWHVysgDsjlR+85h/DkEhvsaMRDLYWqFwYgXizMoGNKVkwDVIH+qkhBmbggQfDpcYPKT
1gq0d6fw0eKVJtO8+vonMEcE0sWZvHmZvSSu0H++gxoe1W/JtzbCteH3Ak0zktwBHI8Qt
WBqFvY3laad335tpkFJN5b949N+DP8svCWwRwXmkZlHplPYZWF6QpYbEEXL/6Q0H6VwL+
om4f7ybYpKe9Gl939uv2INnXaKe5EU6CMsSw40r2XZCjnSTjWOTgh9pUn2PsoHnqUlALW
VR4Z+ipnCrEbu8aTmX3ROcnwYNS7OXkq4uhwDU1u9QjzyMHet6NQQhwhGtimsTo9KhL4E
TEUNiRlbAgow9WOwM5r3vRzddO8T2HZZSGaWj75qNRX46XPQWRWgB7ItAwyXgwLZ8UzWl
HdztjS3D7Hlsqno3zxNOVlhA5/vl9uVnhFbJnMtUOJAB07YoTJOeR+LjQ0avx/VzopxXc
RA/WvJXVZSBrlAHY0+ip4wPZvdi4Ph90gpmvHJvoH82KVfp2j5jxzUhsage94I=


@@ -7,7 +7,7 @@ To learn how to contribute, please see CONTRIBUTING.md
- Alessio Ferrari (Politecnico di Torino) <alessio.ferrari@polito.it>
- Anders Lindgren (Telia Company) <Anders.X.Lindgren@teliacompany.com>
- Andrea d'Amico (Politecnico di Torino) <andrea.damico@polito.it>
- Andrea D'Amico (Politecnico di Torino) <andrea.damico@polito.it>
- Brian Taylor (Facebook) <briantaylor@fb.com>
- David Boertjes (Ciena) <dboertje@ciena.com>
- Diego Landa (Facebook) <dlanda@fb.com>


@@ -1,7 +1,18 @@
FROM python:3.7-slim
COPY . /oopt-gnpy
WORKDIR /oopt-gnpy
RUN python setup.py install
WORKDIR /shared/examples
ENTRYPOINT ["/oopt-gnpy/.docker-entry.sh"]
WORKDIR /opt/application/oopt-gnpy
RUN mkdir -p /shared/example-data \
&& groupadd gnpy \
&& useradd -u 1000 -g gnpy -m gnpy \
&& apt-get update \
&& apt-get install git -y \
&& rm -rf /var/lib/apt/lists/*
COPY . /opt/application/oopt-gnpy
WORKDIR /opt/application/oopt-gnpy
RUN mkdir topology \
&& mkdir equipment \
&& mkdir autodesign \
&& pip install . \
&& chown -Rc gnpy:gnpy /opt/application/oopt-gnpy /shared/example-data
USER gnpy
ENTRYPOINT ["/opt/application/oopt-gnpy/.docker-entry.sh"]
CMD ["/bin/bash"]


@@ -7,7 +7,7 @@
`gnpy`: mesh optical network route planning and optimization library
====================================================================
|docs| |build|
|docs| |travis| |doi| |contributors| |codacy-quality| |codecov|
**`gnpy` is an open-source, community-developed library for building route
planning and optimization tools in real-world mesh optical networks.**
@@ -27,112 +27,14 @@ Documentation: https://gnpy.readthedocs.io
Get In Touch
~~~~~~~~~~~~
There are `weekly calls <https://telecominfraproject.workplace.com/events/458339931322799/>`__ about our progress.
There are `weekly calls <https://telecominfraproject.workplace.com/events/702894886867547/>`__ about our progress.
Newcomers, users and telecom operators are especially welcome there.
We encourage all interested people outside the TIP to `join the project <https://telecominfraproject.com/apply-for-membership/>`__.
Branches and Tagged Releases
----------------------------
- all releases are `available via GitHub <https://github.com/Telecominfraproject/oopt-gnpy/releases>`_
- the `master <https://github.com/Telecominfraproject/oopt-gnpy/tree/master>`_ branch contains stable, `validated code <https://github.com/Telecominfraproject/oopt-gnpy/wiki/Testing-for-Quality>`_. It is updated from develop on a release schedule determined by the OOPT-PSE Working Group.
- the `develop <https://github.com/Telecominfraproject/oopt-gnpy/tree/develop>`_ branch contains the latest code under active development, which may not be fully validated and tested.
How to Install
--------------
Using prebuilt Docker images
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Our `Docker images <https://hub.docker.com/r/telecominfraproject/oopt-gnpy>`_ contain everything needed to run all examples from this guide.
Docker transparently fetches the image over the network upon first use.
On Linux and Mac, run:
.. code-block:: shell-session
$ docker run -it --rm --volume $(pwd):/shared telecominfraproject/oopt-gnpy
root@bea050f186f7:/shared/examples#
On Windows, launch from Powershell as:
.. code-block:: powershell
PS C:\> docker run -it --rm --volume ${PWD}:/shared telecominfraproject/oopt-gnpy
root@89784e577d44:/shared/examples#
In both cases, a directory named ``examples/`` will appear in your current working directory.
GNPy automatically populates it with example files from the current release.
Remove that directory if you want to start from scratch.
Using Python on your computer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
**Note**: `gnpy` supports Python 3 only. Python 2 is not supported.
`gnpy` requires Python ≥3.6
**Note**: the `gnpy` maintainers strongly recommend the use of Anaconda for
managing dependencies.
It is recommended that you use a "virtual environment" when installing `gnpy`.
Do not install `gnpy` on your system Python.
We recommend the use of the `Anaconda Python distribution <https://www.anaconda.com/download>`_ which comes with many scientific computing
dependencies pre-installed. Anaconda creates a base "virtual environment" for
you automatically. You can also create and manage your ``conda`` "virtual
environments" yourself (see:
https://conda.io/docs/user-guide/tasks/manage-environments.html)
To activate your Anaconda virtual environment, you may need to do the
following:
.. code-block:: shell
$ source /path/to/anaconda/bin/activate # activate Anaconda base environment
(base) $ # note the change to the prompt
You can check which Anaconda environment you are using with:
.. code-block:: shell
(base) $ conda env list # list all environments
# conda environments:
#
base * /src/install/anaconda3
(base) $ echo $CONDA_DEFAULT_ENV # show default environment
base
You can check your version of Python with the following. If you are using
Anaconda's Python 3, you should see similar output as below. Your results may
be slightly different depending on your Anaconda installation path and the
exact version of Python you are using.
.. code-block:: shell
$ which python # check which Python executable is used
/path/to/anaconda/bin/python
$ python -V # check your Python version
Python 3.6.5 :: Anaconda, Inc.
From within your Anaconda Python 3 environment, you can clone the master branch
of the `gnpy` repo and install it with:
.. code-block:: shell
$ git clone https://github.com/Telecominfraproject/oopt-gnpy # clone the repo
$ cd oopt-gnpy
$ python setup.py install # install
To test that `gnpy` was successfully installed, you can run this command. If it
executes without a ``ModuleNotFoundError``, you have successfully installed
`gnpy`.
.. code-block:: shell
$ python -c 'import gnpy' # attempt to import gnpy
$ pytest # run tests
Install either via `Docker <docs/install.rst#install-docker>`__, or as a `Python package <docs/install.rst#install-pip>`__.
Instructions for First Use
--------------------------
@@ -157,25 +59,24 @@ This example demonstrates how GNPy can be used to check the expected SNR at the
:target: https://asciinema.org/a/252295
By default, this script operates on a single span network defined in
`examples/edfa_example_network.json <examples/edfa_example_network.json>`_
`gnpy/example-data/edfa_example_network.json <gnpy/example-data/edfa_example_network.json>`_
You can specify a different network at the command line as follows. For
example, to use the CORONET Global network defined in
`examples/CORONET_Global_Topology.json <examples/CORONET_Global_Topology.json>`_:
`gnpy/example-data/CORONET_Global_Topology.json <gnpy/example-data/CORONET_Global_Topology.json>`_:
.. code-block:: shell-session
$ ./examples/transmission_main_example.py examples/CORONET_Global_Topology.json
$ gnpy-transmission-example $(gnpy-example-data)/CORONET_Global_Topology.json
It is also possible to use an Excel file input (for example
`examples/CORONET_Global_Topology.xls <examples/CORONET_Global_Topology.xls>`_).
The Excel file will be processed into a JSON file with the same prefix. For
further instructions on how to prepare the Excel input file, see
`Excel_userguide.rst <Excel_userguide.rst>`_.
`gnpy/example-data/CORONET_Global_Topology.xlsx <gnpy/example-data/CORONET_Global_Topology.xlsx>`_).
The Excel file will be processed into a JSON file with the same prefix.
Further details about the Excel data structure are available `in the documentation <docs/excel.rst>`__.
The main transmission example will calculate the average signal OSNR and SNR
across network elements (transceiver, ROADMs, fibers, and amplifiers)
between two transceivers selected by the user. Additional details are provided by doing ``transmission_main_example.py -h``. (By default, for the CORONET Global
between two transceivers selected by the user. Additional details are provided by doing ``gnpy-transmission-example -h``. (By default, for the CORONET Global
network, it will show the transmission of spectral information between Abilene and Albany)
This script calculates the average signal OSNR = |OSNR| and SNR = |SNR|.
@@ -189,352 +90,74 @@ interference noise.
.. |Pase| replace:: P\ :sub:`ase`
.. |Pnli| replace:: P\ :sub:`nli`
Further Instructions for Use
----------------------------
Simulations are driven by a set of `JSON <docs/json.rst>`__ or `XLS <docs/excel.rst>`__ files.

Design and transmission parameters are defined in a dedicated json file. By
default, this information is read from `examples/eqpt_config.json
<examples/eqpt_config.json>`_. This file defines the equipment libraries that
can be customized (EDFAs, fibers, and transceivers).
It also defines the simulation parameters (spans, ROADMs, and the spectral
information to transmit.)
The EDFA equipment library is a list of supported amplifiers. New amplifiers
can be added and existing ones removed. Three different noise models are available:
1. ``'type_def': 'variable_gain'`` is a simplified model simulating a 2-coil EDFA with internal, input and output VOAs. The NF vs gain response is calculated accordingly based on the input parameters: ``nf_min``, ``nf_max``, and ``gain_flatmax``. It is not a simple interpolation but a 2-stage NF calculation.
2. ``'type_def': 'fixed_gain'`` is a fixed-gain model: the NF is constant (`nf0`) as long as `gain_min < gain < gain_flatmax`
3. ``'type_def': None`` is an advanced model. A detailed JSON configuration file is required (by default `examples/std_medium_gain_advanced_config.json <examples/std_medium_gain_advanced_config.json>`_). It uses a 3rd order polynomial where NF = f(gain), NF_ripple = f(frequency), gain_ripple = f(frequency), N-array dgt = f(frequency). Compared to the previous models, NF ripple and gain ripple are modelled.
For all amplifier models:
+------------------------+-----------+-----------------------------------------+
| field | type | description |
+========================+===========+=========================================+
| ``type_variety`` | (string) | a unique name to ID the amplifier in the|
| | | JSON/Excel template topology input file |
+------------------------+-----------+-----------------------------------------+
| ``out_voa_auto`` | (boolean) | auto_design feature to optimize the |
| | | amplifier output VOA. If true, output |
| | | VOA is present and will be used to push |
| | | amplifier gain to its maximum, within |
| | | EOL power margins. |
+------------------------+-----------+-----------------------------------------+
| ``allowed_for_design`` | (boolean) | If false, the amplifier will not be |
| | | picked by auto-design but it can still |
| | | be used as a manual input (from JSON or |
| | | Excel template topology files.) |
+------------------------+-----------+-----------------------------------------+
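For illustration, here is a sketch of what a ``variable_gain`` amplifier entry combining these fields could look like. The ``type_variety`` name and all numeric values are illustrative assumptions, and ``gain_flatmax``, ``gain_min`` and ``p_max`` stand for the gain and output power limits used by the noise model:

.. code-block:: json

   {
     "type_variety": "std_medium_gain",
     "type_def": "variable_gain",
     "gain_flatmax": 26,
     "gain_min": 15,
     "p_max": 21,
     "nf_min": 6,
     "nf_max": 10,
     "out_voa_auto": false,
     "allowed_for_design": true
   }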
The fiber library currently describes SSMF and NZDF but additional fiber types can be entered by the user following the same model:
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``type_variety`` | (string) | a unique name to ID the fiber in the |
| | | JSON or Excel template topology input |
| | | file |
+----------------------+-----------+-----------------------------------------+
| ``dispersion`` | (number) | (s.m-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``gamma``            | (number)  | 2pi.n2/(lambda*Aeff) (w-1.m-1)          |
+----------------------+-----------+-----------------------------------------+
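A fiber entry following this model might be sketched as below (SSMF-like values, shown for illustration only):

.. code-block:: json

   {
     "type_variety": "SSMF",
     "dispersion": 1.67e-05,
     "gamma": 0.00127
   }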
The transceiver equipment library is a list of supported transceivers. New
transceivers can be added and existing ones removed at will by the user. It is
used to determine the service list path feasibility when running the
`path_requests_run.py routine <examples/path_requests_run.py>`_.
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``type_variety`` | (string) | A unique name to ID the transceiver in |
| | | the JSON or Excel template topology |
| | | input file |
+----------------------+-----------+-----------------------------------------+
| ``frequency`` | (number) | Min/max as below. |
+----------------------+-----------+-----------------------------------------+
| ``mode`` | (number) | A list of modes supported by the |
| | | transponder. New modes can be added at |
| | | will by the user. The modes are specific|
| | | to each transponder type_variety. |
| | | Each mode is described as below. |
+----------------------+-----------+-----------------------------------------+
The modes are defined as follows:
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``format`` | (string) | a unique name to ID the mode |
+----------------------+-----------+-----------------------------------------+
| ``baud_rate`` | (number) | in Hz |
+----------------------+-----------+-----------------------------------------+
| ``OSNR`` | (number) | min required OSNR in 0.1nm (dB) |
+----------------------+-----------+-----------------------------------------+
| ``bit_rate`` | (number) | in bit/s |
+----------------------+-----------+-----------------------------------------+
| ``roll_off`` | (number) | Not used. |
+----------------------+-----------+-----------------------------------------+
| ``tx_osnr`` | (number) | In dB. OSNR out from transponder. |
+----------------------+-----------+-----------------------------------------+
| ``cost`` | (number) | Arbitrary unit |
+----------------------+-----------+-----------------------------------------+
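Putting the pieces together, a hypothetical transceiver entry with a single mode could be sketched as follows (the ``type_variety``, mode name and figures are placeholders, not vendor data):

.. code-block:: json

   {
     "type_variety": "vendorA_trx-type1",
     "frequency": {
       "min": 191.35e12,
       "max": 196.1e12
     },
     "mode": [
       {
         "format": "mode 1",
         "baud_rate": 32e9,
         "OSNR": 11,
         "bit_rate": 100e9,
         "roll_off": 0.15,
         "tx_osnr": 45,
         "cost": 1
       }
     ]
   }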
Simulation parameters are defined as follows.
Auto-design automatically creates EDFA amplifier network elements when they are
missing, after a fiber, or between a ROADM and a fiber. This auto-design
functionality can be manually and locally deactivated by introducing a ``Fused``
network element after a ``Fiber`` or a ``Roadm`` that doesn't need amplification.
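For instance, a ``Fused`` element inserted into the topology to locally disable amplification might be sketched as follows (the ``uid`` is a hypothetical placeholder; ``loss`` is the optional fuse loss in dB):

.. code-block:: json

   {
     "uid": "fused at site A1",
     "type": "Fused",
     "params": {
       "loss": 0.0
     }
   }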
The amplifier is chosen from the EDFA list of the equipment library based on
gain, power, and NF criteria. Only the EDFAs that are marked
``'allowed_for_design': true`` are considered.
For amplifiers defined in the topology JSON input but whose ``gain = 0``
(placeholder), auto-design will set their gain automatically: see ``power_mode`` in
the ``Spans`` library to find out how the gain is calculated.
Span configuration is performed as follows. It is not a list (which may change
in later releases) and the user can only modify the value of existing
parameters:
+-------------------------------------+-----------+---------------------------------------------+
| field | type | description |
+=====================================+===========+=============================================+
| ``power_mode`` | (boolean) | If false, gain mode. Auto-design sets |
| | | amplifier gain = preceding span loss, |
| | | unless the amplifier exists and its |
| | | gain > 0 in the topology input JSON. |
| | | If true, power mode (recommended for |
| | | auto-design and power sweep.) |
| | | Auto-design sets amplifier power |
| | | according to delta_power_range. If the |
| | | amplifier exists with gain > 0 in the |
| | | topology JSON input, then its gain is |
| | | translated into a power target/channel. |
| | | Moreover, when performing a power sweep |
| | | (see ``power_range_db`` in the SI |
| | | configuration library) the power sweep |
| | | is performed w/r/t this power target, |
| | | regardless of preceding amplifiers |
| | | power saturation/limitations. |
+-------------------------------------+-----------+---------------------------------------------+
| ``delta_power_range_db`` | (number) | Auto-design only, power-mode |
| | | only. Specifies the [min, max, step] |
| | | power excursion/span. It is a relative |
| | | power excursion w/r/t the |
| | | power_dbm + power_range_db |
| | | (power sweep if applicable) defined in |
| | | the SI configuration library. This |
| | | relative power excursion is = 1/3 of |
| | | the span loss difference with the |
| | | reference 20 dB span. The 1/3 slope is |
| | | derived from the GN model equations. |
| | | For example, a 23 dB span loss will be |
| | | set to 1 dB more power than a 20 dB |
| | | span loss. The 20 dB reference spans |
| | | will *always* be set to |
| | | power = power_dbm + power_range_db. |
| | | To configure the same power in all |
| | | spans, use `[0, 0, 0]`. All spans will |
| | | be set to |
| | | power = power_dbm + power_range_db. |
| | | To configure the same power in all spans |
| | | and 3 dB more power just for the longest |
| | | spans: `[0, 3, 3]`. The longest spans are |
| | | set to |
| | | power = power_dbm + power_range_db + 3. |
| | | To configure a 4 dB power range across |
| | | all spans in 0.5 dB steps: `[-2, 2, 0.5]`. |
| | | A 17 dB span is set to |
| | | power = power_dbm + power_range_db - 1, |
| | | a 20 dB span to |
| | | power = power_dbm + power_range_db and |
| | | a 23 dB span to |
| | | power = power_dbm + power_range_db + 1 |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_fiber_lineic_loss_for_raman`` | (number) | Maximum linear fiber loss for Raman |
| | | amplification use. |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_length``                      | (number)  | Split fiber lengths > max_length.           |
|                                     |           | Intended to support high-level              |
|                                     |           | topologies that do not specify in-line      |
|                                     |           | amplification sites. For example, the       |
|                                     |           | CORONET_Global_Topology.xls defines         |
|                                     |           | links > 1000 km between 2 sites: they       |
|                                     |           | couldn't be simulated if these links        |
|                                     |           | were not split into shorter spans.          |
+-------------------------------------+-----------+---------------------------------------------+
| ``length_unit`` | "m"/"km" | Unit for ``max_length``. |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_loss`` | (number) | Not used in the current code |
| | | implementation. |
+-------------------------------------+-----------+---------------------------------------------+
| ``padding`` | (number) | In dB. Min span loss before putting an |
| | | attenuator before fiber. Attenuator |
| | | value |
| | | Fiber.att_in = max(0, padding - span_loss). |
| | | Padding can be set manually to reach a |
| | | higher padding value for a given fiber |
| | | by filling in the Fiber/params/att_in |
| | | field in the topology json input [1] |
| | | but if span_loss = length * loss_coef |
| | | + att_in + con_in + con_out < padding, |
| | | the specified att_in value will be |
| | | completed to have span_loss = padding. |
| | | Therefore it is not possible to set |
| | | span_loss < padding. |
+-------------------------------------+-----------+---------------------------------------------+
| ``EOL`` | (number) | All fiber span loss ageing. The value |
| | | is added to the con_out (fiber output |
| | | connector). So the design and the path |
| | | feasibility are performed with |
| | | span_loss + EOL. EOL cannot be set |
| | | manually for a given fiber span |
| | | (workaround is to specify higher |
| | | ``con_out`` loss for this fiber). |
+-------------------------------------+-----------+---------------------------------------------+
| ``con_in``, | (number) | Default values if Fiber/params/con_in/out |
| ``con_out`` | | is None in the topology input |
| | | description. This default value is |
| | | ignored if a Fiber/params/con_in/out |
| | | value is input in the topology for a |
| | | given Fiber. |
+-------------------------------------+-----------+---------------------------------------------+
.. code-block:: json
{
"uid": "fiber (A1->A2)",
"type": "Fiber",
"type_variety": "SSMF",
"params":
{
"type_variety": "SSMF",
"length": 120.0,
"loss_coef": 0.2,
"length_units": "km",
"att_in": 0,
"con_in": 0,
"con_out": 0
}
}
ROADMs can be configured as follows. The user can only modify the value of
existing parameters:
+--------------------------+-----------+---------------------------------------------+
| field | type | description |
+==========================+===========+=============================================+
| ``target_pch_out_db`` | (number) | Auto-design sets the ROADM egress channel |
| | | power. This reflects typical control loop |
| | | algorithms that adjust ROADM losses to |
|                          |           | equalize channels (e.g. coming from         |
|                          |           | different ingress directions or add ports). |
| | | This is the default value |
| | | Roadm/params/target_pch_out_db if no value |
| | | is given in the ``Roadm`` element in the |
| | | topology input description. |
| | | This default value is ignored if a |
| | | params/target_pch_out_db value is input in |
| | | the topology for a given ROADM. |
+--------------------------+-----------+---------------------------------------------+
| ``add_drop_osnr`` | (number) | OSNR contribution from the add/drop ports |
+--------------------------+-----------+---------------------------------------------+
| ``restrictions`` | (dict of | If non-empty, keys ``preamp_variety_list`` |
| | strings) | and ``booster_variety_list`` represent |
| | | list of ``type_variety`` amplifiers which |
| | | are allowed for auto-design within ROADM's |
| | | line degrees. |
| | | |
| | | If no booster should be placed on a degree, |
| | | insert a ``Fused`` node on the degree |
| | | output. |
+--------------------------+-----------+---------------------------------------------+
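As an illustration, a ROADM configuration combining these fields might be sketched as below (values are illustrative; empty restriction lists mean that any amplifier may be picked):

.. code-block:: json

   {
     "target_pch_out_db": -20,
     "add_drop_osnr": 38,
     "restrictions": {
       "preamp_variety_list": [],
       "booster_variety_list": []
     }
   }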
The ``SpectralInformation`` object can be configured as follows. The user can
only modify the value of existing parameters. It defines a spectrum of N
identical carriers. While the code libraries allow for different carriers and
power levels, the current user parametrization only allows one carrier type and
one power/channel definition.
+----------------------+-----------+-------------------------------------------+
| field | type | description |
+======================+===========+===========================================+
| ``f_min``,           | (number)  | In Hz. Minimum and maximum carrier        |
| ``f_max``            |           | frequencies.                              |
+----------------------+-----------+-------------------------------------------+
| ``baud_rate`` | (number) | In Hz. Simulated baud rate. |
+----------------------+-----------+-------------------------------------------+
| ``spacing`` | (number) | In Hz. Carrier spacing. |
+----------------------+-----------+-------------------------------------------+
| ``roll_off`` | (number) | Not used. |
+----------------------+-----------+-------------------------------------------+
| ``tx_osnr`` | (number) | In dB. OSNR out from transponder. |
+----------------------+-----------+-------------------------------------------+
| ``power_dbm`` | (number) | Reference channel power. In gain mode |
| | | (see spans/power_mode = false), all gain |
| | | settings are offset w/r/t this reference |
| | | power. In power mode, it is the |
| | | reference power for |
| | | Spans/delta_power_range_db. For example, |
| | | if delta_power_range_db = `[0,0,0]`, the |
|                      |           | if delta_power_range_db = `[0,0,0]`, the  |
|                      |           | same power=power_dbm is launched in every |
|                      |           | span. The network design is performed     |
| | | power sweep is defined (see after) the |
| | | design is not repeated. |
+----------------------+-----------+-------------------------------------------+
| ``power_range_db`` | (number) | Power sweep excursion around power_dbm. |
| | | It is not the min and max channel power |
| | | values! The reference power becomes: |
| | | power_range_db + power_dbm. |
+----------------------+-----------+-------------------------------------------+
| ``sys_margins`` | (number) | In dB. Added margin on min required |
| | | transceiver OSNR. |
+----------------------+-----------+-------------------------------------------+
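A ``SpectralInformation`` block combining these fields might be sketched as follows (the values mirror the 32 Gbaud, 50 GHz spacing, 0 dBm/channel default spectrum described below; the remaining figures are illustrative assumptions):

.. code-block:: json

   {
     "f_min": 191.35e12,
     "f_max": 196.1e12,
     "baud_rate": 32e9,
     "spacing": 50e9,
     "roll_off": 0.15,
     "tx_osnr": 45,
     "power_dbm": 0,
     "power_range_db": [0, 0, 1],
     "sys_margins": 0
   }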
The ``gnpy-transmission-example`` script propagates a spectrum of channels at 32 Gbaud, 50 GHz spacing and 0 dBm/channel.
Launch power can be overridden by using the ``--power`` argument.
Spectrum information is not yet parametrized from the command line, but it can be modified directly in ``eqpt_config.json`` (via the ``SpectralInformation`` -SI- structure) to accommodate any baud rate or spacing.
The number of channels is computed from the ``spacing`` and the ``f_min``, ``f_max`` values; for example, f_min = 191.35 THz, f_max = 196.1 THz and 50 GHz spacing yield 96 channels.
Experimental support for Raman amplification is available:
.. code-block:: shell-session
$ gnpy-transmission-example \
  $(gnpy-example-data)/raman_edfa_example_network.json \
  --sim $(gnpy-example-data)/sim_params.json --show-channels
Configuration of the Raman pumps (their frequencies, power and pumping direction) is done via the `RamanFiber element in the network topology <gnpy/example-data/raman_edfa_example_network.json>`_.
General numeric parameters for simulation control are provided in `gnpy/example-data/sim_params.json <gnpy/example-data/sim_params.json>`_.
Use ``gnpy-path-request`` to request several paths at once:
.. code-block:: shell-session
$ cd $(gnpy-example-data)
$ gnpy-path-request -o output_file.json \
  meshTopologyExampleV2.xls meshTopologyExampleV2_services.json
This program operates on a network topology (`JSON <docs/json.rst>`__ or `Excel <docs/excel.rst>`__ format), processing the list of service requests (JSON or XLS again).
The service requests and reply formats are based on the `draft-ietf-teas-yang-path-computation-01 <https://tools.ietf.org/html/draft-ietf-teas-yang-path-computation-01>`__ with custom extensions (e.g., for transponder modes).
Important note: ``gnpy-path-request`` is not a network dimensioning tool: each service neither reserves spectrum nor occupies resources such as transponders. It only computes path feasibility, assuming the spectrum (between defined frequencies) is loaded with "nb of channels" spaced by the "spacing" value specified in the system parameters input in the service file, each channel having the same characteristics in terms of baud rate, format, ... as the service transponder. The transceiver element acts as a "logical starting/stopping point" for the spectral information propagation. It is not meant to represent the capacity of add/drop ports.
As a result, the transponder type is not part of the network information; it is associated with the list of service requests.

The current version includes a spectrum assignment feature that computes a candidate spectrum assignment for each service based on a first-fit policy. Spectrum is assigned based on the spacing value specified for the service, its path_bandwidth value and the selected transceiver mode. This spectrum assignment includes a basic capacity planning capability: the spectrum resource is limited by the min and max frequency values defined for the links. If the requested services exceed a link's spectrum capacity, the feasibility of the additional services is still computed, but they are marked as blocked for spectrum reasons.
REST API (experimental)
-----------------------
``gnpy`` provides an experimental API for requesting several paths at once. It is based on a Flask server.
You can run it from the command line or via Docker.
.. code-block:: shell-session
$ gnpy-rest
.. code-block:: shell-session
$ docker run -p 8080:8080 -it emmanuelledelfour/gnpy-experimental:candi-1.0 gnpy-rest
When starting, the API server will ask for an encryption/decryption key. This key is used to encrypt the equipment file when using the /api/v1/equipments endpoint.
This key is a Fernet key and can be generated this way:
.. code-block:: python
from cryptography.fernet import Fernet

# generate a new random key and print it in its base64 text form
print(Fernet.generate_key().decode())
After typing the key, you can detach the container by pressing ^P^Q.
After starting the API server, you can launch a request:
.. code-block:: shell-session
$ curl -v -X POST -H "Content-Type: application/json" -d @<PATH_TO_JSON_REQUEST_FILE> https://localhost:8080/api/v1/path-computation -k
TODO: api documentation, unit tests, real WSGI server with trusted certificates
Contributing
------------
@@ -591,14 +214,34 @@ working group set out to disrupt the planning landscape by providing an open
source simulation model which can be used freely across multiple vendor
implementations.
.. |docs| image:: https://readthedocs.org/projects/gnpy/badge/?version=master
:target: http://gnpy.readthedocs.io/en/master/?badge=master
:alt: Documentation Status
:scale: 100%
.. |travis| image:: https://travis-ci.com/Telecominfraproject/oopt-gnpy.svg?branch=master
:target: https://travis-ci.com/Telecominfraproject/oopt-gnpy
:alt: Build Status via Travis CI
:scale: 100%
.. |doi| image:: https://zenodo.org/badge/96894149.svg
:target: https://zenodo.org/badge/latestdoi/96894149
:alt: DOI
:scale: 100%
.. |contributors| image:: https://img.shields.io/github/contributors-anon/Telecominfraproject/oopt-gnpy
:target: https://github.com/Telecominfraproject/oopt-gnpy/graphs/contributors
:alt: Code Contributors via GitHub
:scale: 100%
.. |codacy-quality| image:: https://img.shields.io/lgtm/grade/python/github/Telecominfraproject/oopt-gnpy
:target: https://lgtm.com/projects/g/Telecominfraproject/oopt-gnpy/
:alt: Code Quality via LGTM.com
:scale: 100%
.. |codecov| image:: https://img.shields.io/codecov/c/github/Telecominfraproject/oopt-gnpy
:target: https://codecov.io/gh/Telecominfraproject/oopt-gnpy
:alt: Code Coverage via codecov
:scale: 100%
TIP OOPT/PSE & PSE WG Charter
docs/conf.py
@@ -32,7 +32,9 @@ sys.path.insert(0, os.path.abspath('../'))
# ones.
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.mathjax',
'sphinx.ext.githubpages',
'sphinxcontrib.bibtex',
'pbr.sphinxext',]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -51,15 +53,6 @@ project = 'gnpy'
copyright = '2018, Telecom InfraProject - OOPT PSE Group'
author = 'Telecom InfraProject - OOPT PSE Group'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '0.1'
# The full version, including alpha/beta/rc tags.
release = '0.1'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
@@ -87,8 +80,17 @@ todo_include_todos = False
on_rtd = os.environ.get('READTHEDOCS') == 'True'
if on_rtd:
    html_theme = 'default'
    html_theme_options = {
        'logo_only': True,
    }
else:
    html_theme = 'alabaster'
    html_theme_options = {
        'logo': 'images/GNPy-logo.png',
        'logo_name': False,
    }
html_logo = 'images/GNPy-logo.png'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -99,7 +101,7 @@ else:
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = []
# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
@@ -173,4 +175,9 @@ texinfo_documents = [
'Miscellaneous'),
]
autodoc_default_options = {
    'members': True,
    'undoc-members': True,
    'private-members': True,
    'show-inheritance': True,
}
docs/excel.rst
@@ -1,8 +1,7 @@
Excel (XLS, XLSX) input files
=============================
How to prepare the Excel input file
-----------------------------------
``gnpy-transmission-example`` gives the possibility to use an Excel input file instead of a JSON file. The program will then generate the corresponding JSON file for you.
The file named 'meshTopologyExampleV2.xls' is an example.
@@ -16,6 +15,8 @@ In order to work the excel file MUST contain at least 2 sheets:
- Eqt
- Service
.. _excel-nodes-sheet:
Nodes sheet
-----------
@@ -34,7 +35,7 @@ Each line represents a 'node' (ROADM site or an in line amplifier site ILA or a
- If filled, it can take "ROADM", "FUSED" or "ILA" values. If another string is used, it will be considered as not filled. FUSED means that ingress and egress spans will be fused together.
- *State*, *Country*, *Region* are not mandatory.
"Region" is a holdover from the CORONET topology reference file `CORONET_Global_Topology.xls <examples/CORONET_Global_Topology.xls>`_. CORONET separates its network into geographical regions (Europe, Asia, Continental US.) This information is not used by gnpy.
"Region" is a holdover from the CORONET topology reference file `CORONET_Global_Topology.xlsx <gnpy/example-data/CORONET_Global_Topology.xlsx>`_. CORONET separates its network into geographical regions (Europe, Asia, Continental US.) This information is not used by gnpy.
- *Longitude*, *Latitude* are not mandatory. If filled they should contain numbers.
@@ -44,6 +45,8 @@ Each line represents a 'node' (ROADM site or an in line amplifier site ILA or a
**There MUST NOT be empty line(s) between two nodes lines**
.. _excel-links-sheet:
Links sheet
-----------
@@ -80,11 +83,11 @@ and a fiber span from node3 to node6::
- If filled it MUST contain numbers. If empty it is replaced by a default "80" km value.
- If value is below 150 km, it is considered as a single (bidirectional) fiber span.
- If the value is over 150 km, the ``gnpy-transmission-example`` program will automatically assume that intermediate span descriptions are required and will generate fiber span elements with "_1", "_2", ... trailing strings which are not visible in the json output. The reason for the splitting is that current EDFAs usually do not support large span loss. The current assumption is that links longer than 150 km require intermediate amplification. This value will be revisited when Raman amplification is added.
- **Fiber type** is not mandatory.
If filled it must contain types listed in `eqpt_config.json <gnpy/example-data/eqpt_config.json>`_ in "Fiber" list "type_variety".
If not filled it takes "SSMF" as default value.
- **Lineic att** is not mandatory.
@@ -113,14 +116,16 @@ and a fiber span from node3 to node6::
(in progress)
.. _excel-equipment-sheet:
Eqpt sheet
----------
The Eqpt sheet is optional. It lists the amplifier types and characteristics on each degree of the *Node A* line.
Eqpt sheet must contain fourteen columns::

   <-- east cable from a to z -->                                  <-- west from z to a -->
   Node A ; Node Z ; amp type ; att_in ; amp gain ; tilt ; att_out ; delta_p ; amp type ; att_in ; amp gain ; tilt ; att_out ; delta_p
If the sheet is present, it MUST have as many lines as egress directions of ROADMs defined in Links Sheet.
@@ -150,11 +155,11 @@ then Eqpt sheet should contain:
C - amp3
In case you already have filled Nodes and Links sheets, `create_eqpt_sheet.py <gnpy/example-data/create_eqpt_sheet.py>`_ can be used to automatically create a template for the mandatory entries of the list.
.. code-block:: shell
$ cd $(gnpy-example-data)
$ python create_eqpt_sheet.py meshTopologyExampleV2.xls
This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content can be directly copied into the Eqpt sheet of the Excel file. The user can then fill in the remaining columns.
@@ -167,7 +172,7 @@ This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content ca
- **Node Z** is mandatory. It is the egress direction from the *Node A* site. Multiple links between the same Node A and Node Z are not supported.
- **amp type** is not mandatory.
If filled it must contain types listed in `eqpt_config.json <gnpy/example-data/eqpt_config.json>`_ in "Edfa" list "type_variety".
If not filled it takes "std_medium_gain" as default value.
If filled with "fused", a fused element with 0.0 dB loss will be placed instead of an amplifier. This can be used to avoid a booster amplifier on a ROADM direction.
@@ -175,19 +180,23 @@ This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content ca
If not filled, it will be determined with design rules in the convert.py file.
If filled, it must contain positive numbers.
- *att_in* and *att_out* are not mandatory and are not used yet. They are the values of the attenuators at the input and output of the amplifier (in dB).
If filled they must contain positive numbers.
- *tilt* --TODO--
- **delta_p**, in dBm, is not mandatory. If filled, it is used to set the target output power per channel at the output of the amplifier when power_mode is True. The output power is then set to power_dbm + delta_p.
# to be completed #
(in progress)
.. _excel-service-sheet:
Service sheet
-------------
Service sheet is optional. It lists the services for which path and feasibility must be computed with ``gnpy-path-request``.
Service sheet must contain 11 columns::
@@ -216,36 +225,4 @@ Service sheet must contain 11 columns::
- path: the set of ROADM nodes that must be used by the path. It must contain the list of ROADM names that the path must cross. TODO: only ROADM nodes are accepted in this release; relax this to any type of node. If filled it must contain ROADM ids separated by ' | '. Exact names are required.
- is loose? 'no' means that the list of nodes must be strictly followed, while any other value means that the constraint may be relaxed if the node is not reachable.
- **path bandwidth** is mandatory. It is the amount of capacity required between source and destination in Gbit/s. Value should be positive (non zero). It is used to compute the amount of required spectrum for the service.
docs/gnpy-api-core.rst (new file)
@@ -0,0 +1,13 @@
``gnpy.core``
-------------
.. automodule:: gnpy.core
.. automodule:: gnpy.core.ansi_escapes
.. automodule:: gnpy.core.elements
.. automodule:: gnpy.core.equipment
.. automodule:: gnpy.core.exceptions
.. automodule:: gnpy.core.info
.. automodule:: gnpy.core.network
.. automodule:: gnpy.core.parameters
.. automodule:: gnpy.core.science_utils
.. automodule:: gnpy.core.utils
docs/gnpy-api-tools.rst (new file)
@@ -0,0 +1,9 @@
``gnpy.tools``
--------------
.. automodule:: gnpy.tools
.. automodule:: gnpy.tools.cli_examples
.. automodule:: gnpy.tools.convert
.. automodule:: gnpy.tools.json_io
.. automodule:: gnpy.tools.plots
.. automodule:: gnpy.tools.service_sheet
docs/gnpy-api-topology.rst (new file)
@@ -0,0 +1,6 @@
``gnpy.topology``
-----------------
.. automodule:: gnpy.topology
.. automodule:: gnpy.topology.request
.. automodule:: gnpy.topology.spectrum_assignment
docs/gnpy-api.rst (new file)
@@ -0,0 +1,14 @@
***************************
API Reference Documentation
***************************
``gnpy`` package
================
.. automodule:: gnpy
.. toctree::
gnpy-api-core
gnpy-api-topology
gnpy-api-tools
docs/images/GNPy-logo.png (new binary file, 20 KiB)
docs/index.rst
@@ -1,33 +1,18 @@
.. gnpy documentation master file, created by
sphinx-quickstart on Mon Dec 18 14:41:01 2017.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to gnpy's documentation!
================================
`GNPy <http://github.com/telecominfraproject/gnpy>`_ is an open-source,
community-developed library for building route planning and optimization tools
in real-world mesh optical networks. It is based on the Gaussian Noise Model.
.. toctree::
:maxdepth: 4
install
json
excel
model
gnpy-api
Indices and tables
==================
@@ -36,67 +21,3 @@ Indices and tables
* :ref:`modindex`
* :ref:`search`
Contributors in alphabetical order
==================================
+----------+------------+-----------------------+--------------------------------------+
| Name | Surname | Affiliation | Contact |
+==========+============+=======================+======================================+
| Alessio | Ferrari | Politecnico di Torino | alessio.ferrari@polito.it |
+----------+------------+-----------------------+--------------------------------------+
| Anders | Lindgren | Telia Company | Anders.X.Lindgren@teliacompany.com |
+----------+------------+-----------------------+--------------------------------------+
| Andrea | d'Amico | Politecnico di Torino | andrea.damico@polito.it |
+----------+------------+-----------------------+--------------------------------------+
| Brian | Taylor | Facebook | briantaylor@fb.com |
+----------+------------+-----------------------+--------------------------------------+
| David | Boertjes | Ciena | dboertje@ciena.com |
+----------+------------+-----------------------+--------------------------------------+
| Diego | Landa | Facebook | dlanda@fb.com |
+----------+------------+-----------------------+--------------------------------------+
| Esther | Le Rouzic | Orange | esther.lerouzic@orange.com |
+----------+------------+-----------------------+--------------------------------------+
| Gabriele | Galimberti | Cisco | ggalimbe@cisco.com |
+----------+------------+-----------------------+--------------------------------------+
| Gert | Grammel | Juniper Networks | ggrammel@juniper.net |
+----------+------------+-----------------------+--------------------------------------+
| Gilad | Goldfarb | Facebook | giladg@fb.com |
+----------+------------+-----------------------+--------------------------------------+
| James | Powell | Telecom Infra Project | james.powell@telecominfraproject.com |
+----------+------------+-----------------------+--------------------------------------+
| Jan | Kundrát | Telecom Infra Project | jan.kundrat@telecominfraproject.com |
+----------+------------+-----------------------+--------------------------------------+
| Jeanluc | Augé | Orange | jeanluc.auge@orange.com |
+----------+------------+-----------------------+--------------------------------------+
| Jonas | Mårtensson | RISE Research Sweden | jonas.martensson@ri.se |
+----------+------------+-----------------------+--------------------------------------+
| Mattia | Cantono | Politecnico di Torino | mattia.cantono@polito.it |
+----------+------------+-----------------------+--------------------------------------+
| Miguel | Garrich | University Catalunya | miquel.garrich@upct.es |
+----------+------------+-----------------------+--------------------------------------+
| Raj | Nagarajan | Lumentum | raj.nagarajan@lumentum.com |
+----------+------------+-----------------------+--------------------------------------+
| Roberts | Miculens | Lattelecom | roberts.miculens@lattelecom.lv |
+----------+------------+-----------------------+--------------------------------------+
| Shengxiang | Zhu | University of Arizona | szhu@email.arizona.edu |
+----------+------------+-----------------------+--------------------------------------+
| Stefan | Melin | Telia Company | Stefan.Melin@teliacompany.com |
+----------+------------+-----------------------+--------------------------------------+
| Vittorio | Curri | Politecnico di Torino | vittorio.curri@polito.it |
+----------+------------+-----------------------+--------------------------------------+
| Xufeng | Liu | Jabil | xufeng_liu@jabil.com |
+----------+------------+-----------------------+--------------------------------------+
TIP OOPT/PSE & PSE WG Charter
-----------------------------
- Goal is to build an end-to-end simulation environment which defines the
network models of the optical device transfer functions and their parameters.
This environment will provide validation of the optical performance
requirements for the TIP OLS building blocks.
- The model may be approximate or complete depending on the network complexity.
Each model shall be validated against the proposed network scenario.
- The environment must be able to process network models from multiple vendors,
and also allow users to pick any implementation in an open source framework.
- The PSE will influence and benefit from the innovation of the DTC, API, and
OLS working groups.
- The PSE represents a step along the journey towards multi-layer optimization.
docs/install.rst (new file)
@@ -0,0 +1,111 @@
Installing GNPy
---------------
There are several ways to obtain GNPy.
The easiest option for a non-developer is probably to use our :ref:`Docker images<install-docker>`.
Developers are encouraged to install the :ref:`Python package in the same way as any other Python package<install-pip>`.
Note that this needs a :ref:`working installation of Python<install-python>`, for example :ref:`via Anaconda<install-anaconda>`.
.. _install-docker:
Using prebuilt Docker images
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Our `Docker images <https://hub.docker.com/r/telecominfraproject/oopt-gnpy>`_ contain everything needed to run all examples from this guide.
Docker transparently fetches the image over the network upon first use.
On Linux and Mac, run:
.. code-block:: shell-session
$ docker run -it --rm --volume $(pwd):/shared telecominfraproject/oopt-gnpy
root@bea050f186f7:/shared/example-data#
On Windows, launch from Powershell as:
.. code-block:: console
PS C:\> docker run -it --rm --volume ${PWD}:/shared telecominfraproject/oopt-gnpy
root@89784e577d44:/shared/example-data#
In both cases, a directory named ``example-data/`` will appear in your current working directory.
GNPy automatically populates it with example files from the current release.
Remove that directory if you want to start from scratch.
.. _install-python:
Using Python on your computer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
**Note**: `gnpy` requires Python ≥3.6. Python 2 is not supported.
**Note**: the `gnpy` maintainers strongly recommend the use of Anaconda for
managing dependencies.
It is recommended that you use a "virtual environment" when installing `gnpy`.
Do not install `gnpy` on your system Python.
.. _install-anaconda:
We recommend the use of the `Anaconda Python distribution <https://www.anaconda.com/download>`_ which comes with many scientific computing
dependencies pre-installed. Anaconda creates a base "virtual environment" for
you automatically. You can also create and manage your ``conda`` "virtual
environments" yourself (see:
https://conda.io/docs/user-guide/tasks/manage-environments.html)
To activate your Anaconda virtual environment, you may need to do the
following:
.. code-block:: shell-session
$ source /path/to/anaconda/bin/activate # activate Anaconda base environment
(base) $ # note the change to the prompt
You can check which Anaconda environment you are using with:
.. code-block:: shell-session
(base) $ conda env list # list all environments
# conda environments:
#
base * /src/install/anaconda3
(base) $ echo $CONDA_DEFAULT_ENV # show default environment
base
You can check your version of Python with the following commands. If you are using
Anaconda's Python 3, you should see output similar to the below. Your results may
differ slightly depending on your Anaconda installation path and the exact
version of Python you are using.
.. code-block:: shell-session
$ which python # check which Python executable is used
/path/to/anaconda/bin/python
$ python -V # check your Python version
Python 3.6.5 :: Anaconda, Inc.
.. _install-pip:
Installing the Python package
*****************************
From within your Anaconda Python 3 environment, you can clone the master branch
of the `gnpy` repo and install it with:
.. code-block:: shell-session
$ git clone https://github.com/Telecominfraproject/oopt-gnpy # clone the repo
$ cd oopt-gnpy
$ python setup.py develop
To verify that `gnpy` was installed successfully, you can run the following
commands. If the import executes without a ``ModuleNotFoundError``, the
installation succeeded.
.. code-block:: shell-session
$ python -c 'import gnpy' # attempt to import gnpy
$ pytest # run tests
docs/json.rst (new file)
@@ -0,0 +1,339 @@
JSON Input Files
================
GNPy uses a set of JSON files for modeling the network.
Some data (such as network topology or the service requests) can be also passed via :ref:`XLS files<excel-service-sheet>`.
Equipment Library
-----------------
Design and transmission parameters are defined in a dedicated json file. By
default, this information is read from `gnpy/example-data/eqpt_config.json
<gnpy/example-data/eqpt_config.json>`_. This file defines the equipment libraries that
can be customized (EDFAs, fibers, and transceivers).
It also defines the simulation parameters (spans, ROADMs, and the spectral
information to transmit.)
EDFA
~~~~
The EDFA equipment library is a list of supported amplifiers. New amplifiers
can be added and existing ones removed. Three different noise models are available:
1. ``'type_def': 'variable_gain'`` is a simplified model simulating a 2-coil EDFA with internal, input and output VOAs. The NF vs gain response is calculated accordingly based on the input parameters: ``nf_min``, ``nf_max``, and ``gain_flatmax``. It is not a simple interpolation but a 2-stage NF calculation.
2. ``'type_def': 'fixed_gain'`` is a fixed-gain model: the NF is constant (`nf0`) as long as `gain_min < gain < gain_flatmax`
3. ``'type_def': None`` is an advanced model. A detailed JSON configuration file is required (by default `gnpy/example-data/std_medium_gain_advanced_config.json <gnpy/example-data/std_medium_gain_advanced_config.json>`_). It uses a 3rd order polynomial where NF = f(gain), NF_ripple = f(frequency), gain_ripple = f(frequency), N-array dgt = f(frequency). Compared to the previous models, NF ripple and gain ripple are modelled.
For all amplifier models:
+------------------------+-----------+-----------------------------------------+
| field | type | description |
+========================+===========+=========================================+
| ``type_variety`` | (string) | a unique name to ID the amplifier in the|
| | | JSON/Excel template topology input file |
+------------------------+-----------+-----------------------------------------+
| ``out_voa_auto`` | (boolean) | auto_design feature to optimize the |
| | | amplifier output VOA. If true, output |
| | | VOA is present and will be used to push |
| | | amplifier gain to its maximum, within |
| | | EOL power margins. |
+------------------------+-----------+-----------------------------------------+
| ``allowed_for_design`` | (boolean) | If false, the amplifier will not be |
| | | picked by auto-design but it can still |
| | | be used as a manual input (from JSON or |
| | | Excel template topology files.) |
+------------------------+-----------+-----------------------------------------+
Fiber
~~~~~
The fiber library currently describes SSMF and NZDF but additional fiber types can be entered by the user following the same model:
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``type_variety`` | (string) | a unique name to ID the fiber in the |
| | | JSON or Excel template topology input |
| | | file |
+----------------------+-----------+-----------------------------------------+
| ``dispersion`` | (number) | (s.m-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``dispersion_slope`` | (number) | (s.m-1.m-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``gamma`` | (number) | 2pi.n2/(lambda*Aeff) (w-1.m-1) |
+----------------------+-----------+-----------------------------------------+
| ``pmd_coef`` | (number) | Polarization mode dispersion (PMD) |
| | | coefficient. (s.sqrt(m)-1) |
+----------------------+-----------+-----------------------------------------+
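Compared to the earlier fiber model, this version adds the dispersion slope and the PMD coefficient. A sketch of such an entry is shown below (SSMF-like illustrative values; the ``dispersion_slope`` and ``pmd_coef`` figures are placeholder assumptions):

.. code-block:: json

   {
     "type_variety": "SSMF",
     "dispersion": 1.67e-05,
     "dispersion_slope": 6.7e-02,
     "gamma": 0.00127,
     "pmd_coef": 1.265e-15
   }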
Transceiver
~~~~~~~~~~~
The transceiver equipment library is a list of supported transceivers. New
transceivers can be added and existing ones removed at will by the user. It is
used to determine the service list path feasibility when running
``gnpy-path-request``.
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``type_variety`` | (string) | A unique name to ID the transceiver in |
| | | the JSON or Excel template topology |
| | | input file |
+----------------------+-----------+-----------------------------------------+
| ``frequency`` | (number) | Min/max as below. |
+----------------------+-----------+-----------------------------------------+
| ``mode`` | (number) | A list of modes supported by the |
| | | transponder. New modes can be added at |
| | | will by the user. The modes are specific|
| | | to each transponder type_variety. |
| | | Each mode is described as below. |
+----------------------+-----------+-----------------------------------------+
The modes are defined as follows:
+----------------------+-----------+-----------------------------------------+
| field | type | description |
+======================+===========+=========================================+
| ``format`` | (string) | a unique name to ID the mode |
+----------------------+-----------+-----------------------------------------+
| ``baud_rate`` | (number) | in Hz |
+----------------------+-----------+-----------------------------------------+
| ``OSNR`` | (number) | min required OSNR in 0.1nm (dB) |
+----------------------+-----------+-----------------------------------------+
| ``bit_rate`` | (number) | in bit/s |
+----------------------+-----------+-----------------------------------------+
| ``roll_off`` | (number) | Pure number between 0 and 1. TX signal |
| | | roll-off shape. Used by Raman-aware |
| | | simulation code. |
+----------------------+-----------+-----------------------------------------+
| ``tx_osnr`` | (number) | In dB. OSNR out from transponder. |
+----------------------+-----------+-----------------------------------------+
| ``cost`` | (number) | Arbitrary unit |
+----------------------+-----------+-----------------------------------------+
Simulation parameters
~~~~~~~~~~~~~~~~~~~~~
Auto-design automatically creates EDFA amplifier network elements when they are
missing, after a fiber, or between a ROADM and a fiber. This auto-design
functionality can be manually and locally deactivated by introducing a ``Fused``
network element after a ``Fiber`` or a ``Roadm`` that doesn't need amplification.
The amplifier is chosen from the EDFA list of the equipment library based on
gain, power, and NF criteria. Only the EDFAs that are marked
``'allowed_for_design': true`` are considered.
For amplifiers defined in the topology JSON input but whose ``gain = 0``
(placeholder), auto-design will set their gain automatically: see ``power_mode`` in
the ``Spans`` library to find out how the gain is calculated.
Span
~~~~
Span configuration is not a list (which may change
in later releases) and the user can only modify the value of existing
parameters:
+-------------------------------------+-----------+---------------------------------------------+
| field | type | description |
+=====================================+===========+=============================================+
| ``power_mode`` | (boolean) | If false, gain mode. Auto-design sets |
| | | amplifier gain = preceding span loss, |
| | | unless the amplifier exists and its |
| | | gain > 0 in the topology input JSON. |
| | | If true, power mode (recommended for |
| | | auto-design and power sweep.) |
| | | Auto-design sets amplifier power |
| | | according to delta_power_range. If the |
| | | amplifier exists with gain > 0 in the |
| | | topology JSON input, then its gain is |
| | | translated into a power target/channel. |
| | | Moreover, when performing a power sweep |
| | | (see ``power_range_db`` in the SI |
| | | configuration library) the power sweep |
| | | is performed w/r/t this power target, |
| | | regardless of preceding amplifiers |
| | | power saturation/limitations. |
+-------------------------------------+-----------+---------------------------------------------+
| ``delta_power_range_db`` | (number) | Auto-design only, power-mode |
| | | only. Specifies the [min, max, step] |
| | | power excursion/span. It is a relative |
| | | power excursion w/r/t the |
| | | power_dbm + power_range_db |
| | | (power sweep if applicable) defined in |
| | | the SI configuration library. This |
| | | relative power excursion is = 1/3 of |
| | | the span loss difference with the |
| | | reference 20 dB span. The 1/3 slope is |
| | | derived from the GN model equations. |
| | | For example, a 23 dB span loss will be |
| | | set to 1 dB more power than a 20 dB |
| | | span loss. The 20 dB reference spans |
| | | will *always* be set to |
| | | power = power_dbm + power_range_db. |
| | | To configure the same power in all |
| | | spans, use `[0, 0, 0]`. All spans will |
| | | be set to |
| | | power = power_dbm + power_range_db. |
| | | To configure the same power in all spans |
| | | and 3 dB more power just for the longest |
| | | spans: `[0, 3, 3]`. The longest spans are |
| | | set to |
| | | power = power_dbm + power_range_db + 3. |
| | | To configure a 4 dB power range across |
| | | all spans in 0.5 dB steps: `[-2, 2, 0.5]`. |
| | | A 17 dB span is set to |
| | | power = power_dbm + power_range_db - 1, |
| | | a 20 dB span to |
| | | power = power_dbm + power_range_db and |
| | | a 23 dB span to |
| | | power = power_dbm + power_range_db + 1 |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_fiber_lineic_loss_for_raman`` | (number) | Maximum linear fiber loss for Raman |
| | | amplification use. |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_length``                      | (number)  | Fibers longer than max_length are split     |
|                                     |           | into multiple spans. Useful for high-level  |
|                                     |           | topologies that do not specify in-line      |
|                                     |           | amplification sites. For example,           |
|                                     |           | CORONET_Global_Topology.xlsx defines links  |
|                                     |           | longer than 1000 km between two sites;      |
|                                     |           | they could not be simulated without being   |
|                                     |           | split into shorter spans.                   |
+-------------------------------------+-----------+---------------------------------------------+
| ``length_units``                    | "m"/"km"  | Unit for ``max_length``.                    |
+-------------------------------------+-----------+---------------------------------------------+
| ``max_loss`` | (number) | Not used in the current code |
| | | implementation. |
+-------------------------------------+-----------+---------------------------------------------+
| ``padding``                         | (number)  | In dB. Minimum span loss; if a span's loss  |
|                                     |           | is lower, an attenuator is inserted at the  |
|                                     |           | fiber input with value                      |
|                                     |           | Fiber.att_in = max(0, padding - span_loss). |
|                                     |           | A higher attenuation can be set manually    |
|                                     |           | for a given fiber by filling in the         |
|                                     |           | Fiber/params/att_in field in the topology   |
|                                     |           | JSON input, but if span_loss = length *     |
|                                     |           | loss_coef + att_in + con_in + con_out is    |
|                                     |           | still below padding, att_in is increased    |
|                                     |           | so that span_loss = padding. It is          |
|                                     |           | therefore not possible to set               |
|                                     |           | span_loss < padding.                        |
+-------------------------------------+-----------+---------------------------------------------+
| ``EOL``                             | (number)  | In dB. End-of-life loss ageing margin,      |
|                                     |           | added to the con_out (fiber output          |
|                                     |           | connector) of every span, so design and     |
|                                     |           | path feasibility are computed with          |
|                                     |           | span_loss + EOL. EOL cannot be set          |
|                                     |           | manually for a given fiber span (a          |
|                                     |           | workaround is to specify a higher           |
|                                     |           | ``con_out`` loss for that fiber).           |
+-------------------------------------+-----------+---------------------------------------------+
| ``con_in``, | (number) | Default values if Fiber/params/con_in/out |
| ``con_out`` | | is None in the topology input |
| | | description. This default value is |
| | | ignored if a Fiber/params/con_in/out |
| | | value is input in the topology for a |
| | | given Fiber. |
+-------------------------------------+-----------+---------------------------------------------+
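Assembled from the fields above, a ``Span`` section of the equipment library
(the key name used by the code, e.g. ``equipment['Span']['default']``) might
look like the following sketch; all values are illustrative only:

.. code-block:: json

   {
       "Span": [{
           "power_mode": true,
           "delta_power_range_db": [0, 0, 0],
           "max_fiber_lineic_loss_for_raman": 0.25,
           "max_length": 150,
           "length_units": "km",
           "max_loss": 28,
           "padding": 10,
           "EOL": 0,
           "con_in": 0,
           "con_out": 0
       }]
   }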
For comparison, here is an example ``Fiber`` element from the topology input,
showing the ``params`` for which this section provides default values:

.. code-block:: json

   {
       "uid": "fiber (A1->A2)",
       "type": "Fiber",
       "type_variety": "SSMF",
       "params":
       {
           "length": 120.0,
           "loss_coef": 0.2,
           "length_units": "km",
           "att_in": 0,
           "con_in": 0,
           "con_out": 0
       }
   }
ROADM
~~~~~
The user can only modify the values of existing parameters:
+--------------------------+-----------+---------------------------------------------+
| field | type | description |
+==========================+===========+=============================================+
| ``target_pch_out_db`` | (number) | Auto-design sets the ROADM egress channel |
| | | power. This reflects typical control loop |
| | | algorithms that adjust ROADM losses to |
|                          |           | equalize channels (e.g. coming from         |
|                          |           | different ingress directions or add         |
|                          |           | ports). This is the default value           |
| | | Roadm/params/target_pch_out_db if no value |
| | | is given in the ``Roadm`` element in the |
| | | topology input description. |
| | | This default value is ignored if a |
| | | params/target_pch_out_db value is input in |
| | | the topology for a given ROADM. |
+--------------------------+-----------+---------------------------------------------+
| ``add_drop_osnr`` | (number) | OSNR contribution from the add/drop ports |
+--------------------------+-----------+---------------------------------------------+
| ``pmd``                  | (number)  | Polarization mode dispersion (PMD), in s.   |
+--------------------------+-----------+---------------------------------------------+
| ``restrictions`` | (dict of | If non-empty, keys ``preamp_variety_list`` |
| | strings) | and ``booster_variety_list`` represent |
|                          |           | lists of ``type_variety`` amplifiers        |
|                          |           | allowed for auto-design within the          |
|                          |           | ROADM's line degrees.                       |
| | | |
| | | If no booster should be placed on a degree, |
| | | insert a ``Fused`` node on the degree |
| | | output. |
+--------------------------+-----------+---------------------------------------------+
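A matching ``Roadm`` entry in the equipment library might look like the
following sketch (values are illustrative; empty restriction lists mean that
any amplifier with ``allowed_for_design`` set may be picked):

.. code-block:: json

   {
       "Roadm": [{
           "target_pch_out_db": -20,
           "add_drop_osnr": 38,
           "pmd": 0,
           "restrictions": {
               "preamp_variety_list": [],
               "booster_variety_list": []
           }
       }]
   }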
SpectralInformation
~~~~~~~~~~~~~~~~~~~
The user can only modify the values of existing parameters. This section
defines a spectrum of N identical carriers. While the code libraries allow
for different carriers and power levels, the current user parametrization
only allows one carrier type and one power/channel definition.
+----------------------+-----------+-------------------------------------------+
| field | type | description |
+======================+===========+===========================================+
| ``f_min``,           | (number)  | In Hz. Carrier frequency range (min/max). |
| ``f_max``            |           |                                           |
+----------------------+-----------+-------------------------------------------+
| ``baud_rate`` | (number) | In Hz. Simulated baud rate. |
+----------------------+-----------+-------------------------------------------+
| ``spacing`` | (number) | In Hz. Carrier spacing. |
+----------------------+-----------+-------------------------------------------+
| ``roll_off`` | (number) | Pure number between 0 and 1. TX signal |
| | | roll-off shape. Used by Raman-aware |
| | | simulation code. |
+----------------------+-----------+-------------------------------------------+
| ``tx_osnr`` | (number) | In dB. OSNR out from transponder. |
+----------------------+-----------+-------------------------------------------+
| ``power_dbm`` | (number) | Reference channel power. In gain mode |
| | | (see spans/power_mode = false), all gain |
| | | settings are offset w/r/t this reference |
| | | power. In power mode, it is the |
| | | reference power for |
| | | Spans/delta_power_range_db. For example, |
| | | if delta_power_range_db = `[0,0,0]`, the |
|                      |           | same power = power_dbm is launched in     |
|                      |           | every span. The network design is         |
|                      |           | performed with the power_dbm value:       |
|                      |           | even if a power sweep is defined (see     |
|                      |           | below), the design is not repeated.       |
+----------------------+-----------+-------------------------------------------+
| ``power_range_db`` | (number) | Power sweep excursion around power_dbm. |
| | | It is not the min and max channel power |
| | | values! The reference power becomes: |
| | | power_range_db + power_dbm. |
+----------------------+-----------+-------------------------------------------+
| ``sys_margins`` | (number) | In dB. Added margin on min required |
| | | transceiver OSNR. |
+----------------------+-----------+-------------------------------------------+
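A complete ``SI`` block consistent with the table above might look like this
sketch (illustrative values; with ``power_range_db`` set to ``[0, 0, 0]`` no
power sweep is performed):

.. code-block:: json

   {
       "SI": [{
           "f_min": 191.35e12,
           "f_max": 196.1e12,
           "baud_rate": 32e9,
           "spacing": 50e9,
           "roll_off": 0.15,
           "tx_osnr": 40,
           "power_dbm": 0,
           "power_range_db": [0, 0, 0],
           "sys_margins": 0
       }]
   }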


@@ -1,5 +1,5 @@
-The QoT estimation in the PSE framework of TIP-OOPT
-=======================================================
+Physical Model used in GNPy
+===========================
 QoT-E including ASE noise and NLI accumulation
 ----------------------------------------------


@@ -1,94 +0,0 @@
gnpy\.core package
==================
Submodules
----------
gnpy\.core\.ansi_escapes module
-------------------------------
.. automodule:: gnpy.core.ansi_escapes
:members:
:undoc-members:
:show-inheritance:
gnpy\.core\.convert module
--------------------------
.. automodule:: gnpy.core.convert
:members:
:undoc-members:
:show-inheritance:
gnpy\.core\.elements module
---------------------------
.. automodule:: gnpy.core.elements
gnpy\.core\.equipment module
----------------------------
.. automodule:: gnpy.core.equipment
:members:
:undoc-members:
:show-inheritance:
gnpy\.core\.exceptions module
-----------------------------
.. automodule:: gnpy.core.exceptions
:members:
:undoc-members:
:show-inheritance:
gnpy\.core\.execute module
--------------------------
.. automodule:: gnpy.core.execute
gnpy\.core\.info module
-----------------------
.. automodule:: gnpy.core.info
gnpy\.core\.network module
--------------------------
.. automodule:: gnpy.core.network
gnpy\.core\.node module
-----------------------
.. automodule:: gnpy.core.node
gnpy\.core\.request module
--------------------------
.. automodule:: gnpy.core.request
:members:
:undoc-members:
:show-inheritance:
gnpy\.core\.service_sheet module
--------------------------------
.. automodule:: gnpy.core.service_sheet
:members:
:undoc-members:
:show-inheritance:
gnpy\.core\.units module
------------------------
.. automodule:: gnpy.core.units
gnpy\.core\.utils module
------------------------
.. automodule:: gnpy.core.utils
Module contents
---------------
.. automodule:: gnpy.core


@@ -1,14 +0,0 @@
gnpy package
============
Subpackages
-----------
.. toctree::
gnpy.core
Module contents
---------------
.. automodule:: gnpy


@@ -1,7 +0,0 @@
gnpy
====
.. toctree::
:maxdepth: 4
gnpy



@@ -1,369 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
path_requests_run.py
====================
Reads a JSON request file in accordance with the Yang model
for requesting path computation and returns path results in terms
of path and feasibility.
See: draft-ietf-teas-yang-path-computation-01.txt
"""
from sys import exit
from argparse import ArgumentParser
from pathlib import Path
from collections import namedtuple
from logging import getLogger, basicConfig, CRITICAL, DEBUG, INFO
from json import dumps, loads
from networkx import (draw_networkx_nodes, draw_networkx_edges,
draw_networkx_labels)
from numpy import mean
from gnpy.core.service_sheet import convert_service_sheet, Request_element, Element
from gnpy.core.utils import load_json
from gnpy.core.network import load_network, build_network, save_network
from gnpy.core.equipment import load_equipment, trx_mode_params, automatic_nch, automatic_spacing
from gnpy.core.elements import Transceiver, Roadm, Edfa, Fused, Fiber
from gnpy.core.utils import db2lin, lin2db
from gnpy.core.request import (Path_request, Result_element, compute_constrained_path,
propagate, jsontocsv, Disjunction, compute_path_dsjctn, requests_aggregation,
propagate_and_optimize_mode)
from gnpy.core.exceptions import ConfigurationError, EquipmentConfigError, NetworkTopologyError
import gnpy.core.ansi_escapes as ansi_escapes
from copy import copy, deepcopy
from textwrap import dedent
from math import ceil
#EQPT_LIBRARY_FILENAME = Path(__file__).parent / 'eqpt_config.json'
logger = getLogger(__name__)
parser = ArgumentParser(description='A function that computes performance for a list of services provided in a JSON file or an Excel sheet.')
parser.add_argument('network_filename', nargs='?', type=Path, default=Path(__file__).parent / 'meshTopologyExampleV2.xls')
parser.add_argument('service_filename', nargs='?', type=Path, default=Path(__file__).parent / 'meshTopologyExampleV2.xls')
parser.add_argument('eqpt_filename', nargs='?', type=Path, default=Path(__file__).parent / 'eqpt_config.json')
parser.add_argument('-v', '--verbose', action='count', default=0, help='increases verbosity for each occurrence')
parser.add_argument('-o', '--output', type=Path)
def requests_from_json(json_data,equipment):
requests_list = []
for req in json_data['path-request']:
# init all params from request
params = {}
params['request_id'] = req['request-id']
params['source'] = req['src-tp-id']
params['destination'] = req['dst-tp-id']
params['trx_type'] = req['path-constraints']['te-bandwidth']['trx_type']
params['trx_mode'] = req['path-constraints']['te-bandwidth']['trx_mode']
params['format'] = params['trx_mode']
nd_list = req['optimizations']['explicit-route-include-objects']
params['nodes_list'] = [n['unnumbered-hop']['node-id'] for n in nd_list]
params['loose_list'] = [n['unnumbered-hop']['hop-type'] for n in nd_list]
params['spacing'] = req['path-constraints']['te-bandwidth']['spacing']
# recover trx physical param (baudrate, ...) from type and mode
# in trx_mode_params optical power is read from equipment['SI']['default'] and
# nb_channel is computed based on min max frequency and spacing
trx_params = trx_mode_params(equipment,params['trx_type'],params['trx_mode'],True)
params.update(trx_params)
# print(trx_params['min_spacing'])
# optical power might be set differently in the request. if it is indicated then the
# params['power'] is updated
if req['path-constraints']['te-bandwidth']['output-power']:
params['power'] = req['path-constraints']['te-bandwidth']['output-power']
# same process for nb-channel
f_min = params['f_min']
f_max_from_si = params['f_max']
if req['path-constraints']['te-bandwidth']['max-nb-of-channel'] is not None :
nch = req['path-constraints']['te-bandwidth']['max-nb-of-channel']
params['nb_channel'] = nch
spacing = params['spacing']
params['f_max'] = f_min + nch*spacing
else :
params['nb_channel'] = automatic_nch(f_min,f_max_from_si,params['spacing'])
consistency_check(params, f_max_from_si)
try :
params['path_bandwidth'] = req['path-constraints']['te-bandwidth']['path_bandwidth']
except KeyError:
pass
requests_list.append(Path_request(**params))
return requests_list
def consistency_check(params, f_max_from_si):
    f_min = params['f_min']
    f_max = params['f_max']
    max_recommended_nb_channels = automatic_nch(f_min, f_max, params['spacing'])
    if params['baud_rate'] is not None:
        # implicitly means that a mode is defined with min_spacing
        if params['min_spacing'] > params['spacing']:
            msg = f'Request {params["request_id"]} has spacing below transponder {params["trx_type"]}' + \
                  f' {params["trx_mode"]} min spacing value {params["min_spacing"]*1e-9}GHz.\n' + \
                  'Computation stopped'
            print(msg)
            logger.critical(msg)
            exit()
        if f_max > f_max_from_si:
            msg = dedent(f'''
            Requested channel number {params["nb_channel"]}, baud rate {params["baud_rate"]*1e-9} GHz and requested spacing {params["spacing"]*1e-9}GHz
            is not consistent with frequency range {f_min*1e-12} THz, {f_max*1e-12} THz, min recommended spacing {params["min_spacing"]*1e-9}GHz.
            max recommended nb of channels is {max_recommended_nb_channels}.
            Computation stopped.''')
            logger.critical(msg)
            exit()
def disjunctions_from_json(json_data):
disjunctions_list = []
for snc in json_data['synchronization']:
params = {}
params['disjunction_id'] = snc['synchronization-id']
params['relaxable'] = snc['svec']['relaxable']
params['link_diverse'] = snc['svec']['link-diverse']
params['node_diverse'] = snc['svec']['node-diverse']
params['disjunctions_req'] = snc['svec']['request-id-number']
disjunctions_list.append(Disjunction(**params))
return disjunctions_list
def load_requests(filename,eqpt_filename):
if filename.suffix.lower() == '.xls':
logger.info('Automatically converting requests from XLS to JSON')
json_data = convert_service_sheet(filename,eqpt_filename)
else:
with open(filename, encoding='utf-8') as f:
json_data = loads(f.read())
return json_data
def compute_path_with_disjunction(network, equipment, pathreqlist, pathlist):
    # use a list, but a dictionary might be helpful to find a path based on request_id
    # TODO change all these req, dsjct, res lists into dicts!
path_res_list = []
for i,pathreq in enumerate(pathreqlist):
        # use the power specified in the request; it might differ from the one specified for design
        # the power is an optional parameter in the request definition:
        # if absent, the one defined in eqpt_config.json is used
p_db = lin2db(pathreq.power*1e3)
p_total_db = p_db + lin2db(pathreq.nb_channel)
print(f'request {pathreq.request_id}')
print(f'Computing path from {pathreq.source} to {pathreq.destination}')
print(f'with path constraint: {[pathreq.source]+pathreq.nodes_list}') #adding first node to be clearer on the output
total_path = pathlist[i]
print(f'Computed path (roadms):{[e.uid for e in total_path if isinstance(e, Roadm)]}\n')
# for debug
# print(f'{pathreq.baud_rate} {pathreq.power} {pathreq.spacing} {pathreq.nb_channel}')
if total_path :
if pathreq.baud_rate is not None:
total_path = propagate(total_path,pathreq,equipment)
# for el in total_path: print(el)
temp_snr01nm = round(mean(total_path[-1].snr+lin2db(pathreq.baud_rate/(12.5e9))),2)
if temp_snr01nm < pathreq.OSNR :
msg = f'\tWarning! Request {pathreq.request_id} computed path from {pathreq.source} to {pathreq.destination} does not pass with {pathreq.tsp_mode}\n' +\
f'\tcomputedSNR in 0.1nm = {temp_snr01nm} - required osnr {pathreq.OSNR}\n'
print(msg)
logger.warning(msg)
total_path = []
else:
total_path,mode = propagate_and_optimize_mode(total_path,pathreq,equipment)
# if no baudrate satisfies spacing, no mode is returned and an empty path is returned
# a warning is shown in the propagate_and_optimize_mode
if mode is not None :
# propagate_and_optimize_mode function returns the mode with the highest bitrate
# that passes. if no mode passes, then it returns an empty path
pathreq.baud_rate = mode['baud_rate']
pathreq.tsp_mode = mode['format']
pathreq.format = mode['format']
pathreq.OSNR = mode['OSNR']
pathreq.tx_osnr = mode['tx_osnr']
pathreq.bit_rate = mode['bit_rate']
else :
total_path = []
        # we record the last transceiver object in order to have the whole
        # information about spectrum. Important note: since transceivers
        # attached to ROADMs are actually logical elements to simulate
        # performance, several demands having the same destination may use
        # the same transponder for the performance simulation. This is why
        # we use deepcopy: to ensure each propagation is recorded and not
        # overwritten
path_res_list.append(deepcopy(total_path))
return path_res_list
def correct_route_list(network, pathreqlist):
    # prepares the format of the route list of nodes to be consistent:
    # remove wrong names, remove endpoints,
    # and also correct source and destination
anytype = [n.uid for n in network.nodes() if not isinstance(n, Transceiver) and not isinstance(n, Fiber)]
    # TODO there is a problem identifying fibers in case of parallel fibers between two adjacent ROADMs,
    # so fiber constraints are not supported
transponders = [n.uid for n in network.nodes() if isinstance(n, Transceiver)]
for pathreq in pathreqlist:
for i,n_id in enumerate(pathreq.nodes_list):
            # replace a possibly wrong name with a formatted ROADM name
# print(n_id)
if n_id not in anytype :
nodes_suggestion = [uid for uid in anytype \
if n_id.lower() in uid.lower()]
if pathreq.loose_list[i] == 'loose':
if len(nodes_suggestion)>0 :
new_n = nodes_suggestion[0]
print(f'invalid route node specified:\
\n\'{n_id}\', replaced with \'{new_n}\'')
pathreq.nodes_list[i] = new_n
else:
print(f'\x1b[1;33;40m'+f'invalid route node specified \'{n_id}\', could not use it as constraint, skipped!'+'\x1b[0m')
pathreq.nodes_list.remove(n_id)
pathreq.loose_list.pop(i)
else:
msg = f'\x1b[1;33;40m'+f'could not find node : {n_id} in network topology. Strict constraint can not be applied.'+'\x1b[0m'
logger.critical(msg)
raise ValueError(msg)
if pathreq.source not in transponders:
msg = f'\x1b[1;31;40m'+f'Request: {pathreq.request_id}: could not find transponder source : {pathreq.source}.'+'\x1b[0m'
logger.critical(msg)
print(f'{msg}\nComputation stopped.')
exit()
if pathreq.destination not in transponders:
msg = f'\x1b[1;31;40m'+f'Request: {pathreq.request_id}: could not find transponder destination : {pathreq.destination}.'+'\x1b[0m'
logger.critical(msg)
print(f'{msg}\nComputation stopped.')
exit()
# TODO remove endpoints from this list in case they were added by the user in the xls or json files
return pathreqlist
def correct_disjn(disjn):
local_disjn = disjn.copy()
for el in local_disjn:
for d in local_disjn:
if set(el.disjunctions_req) == set(d.disjunctions_req) and\
el.disjunction_id != d.disjunction_id:
local_disjn.remove(d)
return local_disjn
def path_result_json(pathresult):
data = {
'path': [n.json for n in pathresult]
}
return data
if __name__ == '__main__':
args = parser.parse_args()
basicConfig(level={2: DEBUG, 1: INFO, 0: CRITICAL}.get(args.verbose, DEBUG))
logger.info(f'Computing path requests {args.service_filename} into JSON format')
print('\x1b[1;34;40m'+f'Computing path requests {args.service_filename} into JSON format'+ '\x1b[0m')
# for debug
# print( args.eqpt_filename)
try:
data = load_requests(args.service_filename,args.eqpt_filename)
equipment = load_equipment(args.eqpt_filename)
network = load_network(args.network_filename,equipment)
except EquipmentConfigError as e:
print(f'{ansi_escapes.red}Configuration error in the equipment library:{ansi_escapes.reset} {e}')
exit(1)
except NetworkTopologyError as e:
print(f'{ansi_escapes.red}Invalid network definition:{ansi_escapes.reset} {e}')
exit(1)
except ConfigurationError as e:
print(f'{ansi_escapes.red}Configuration error:{ansi_escapes.reset} {e}')
exit(1)
# Build the network once using the default power defined in SI in eqpt config
    # TODO power density: db2lin(power_dbm)/power_dbm * nb channels as defined by
    # spacing, f_min and f_max
p_db = equipment['SI']['default'].power_dbm
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,\
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
save_network(args.network_filename, network)
rqs = requests_from_json(data, equipment)
    # check that request ids are unique; non-unique ids may
    # mess up the computation: better to stop it
all_ids = [r.request_id for r in rqs]
if len(all_ids) != len(set(all_ids)):
for a in list(set(all_ids)):
all_ids.remove(a)
msg = f'Requests id {all_ids} are not unique'
logger.critical(msg)
exit()
rqs = correct_route_list(network, rqs)
# pths = compute_path(network, equipment, rqs)
dsjn = disjunctions_from_json(data)
print('\x1b[1;34;40m'+f'List of disjunctions'+ '\x1b[0m')
print(dsjn)
# need to warn or correct in case of wrong disjunction form
# disjunction must not be repeated with same or different ids
dsjn = correct_disjn(dsjn)
# Aggregate demands with same exact constraints
print('\x1b[1;34;40m'+f'Aggregating similar requests'+ '\x1b[0m')
rqs,dsjn = requests_aggregation(rqs,dsjn)
# TODO export novel set of aggregated demands in a json file
print('\x1b[1;34;40m'+'The following services have been requested:'+ '\x1b[0m')
print(rqs)
print('\x1b[1;34;40m'+f'Computing all paths with constraints'+ '\x1b[0m')
pths = compute_path_dsjctn(network, equipment, rqs, dsjn)
print('\x1b[1;34;40m'+f'Propagating on selected path'+ '\x1b[0m')
propagatedpths = compute_path_with_disjunction(network, equipment, rqs, pths)
print('\x1b[1;34;40m'+f'Result summary'+ '\x1b[0m')
header = ['req id', ' demand',' snr@bandwidth',' snr@0.1nm',' Receiver minOSNR', ' mode', ' Gbit/s' , ' nb of tsp pairs']
data = []
data.append(header)
for i, p in enumerate(propagatedpths):
if p:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', f'{round(mean(p[-1].snr),2)}',\
f'{round(mean(p[-1].snr+lin2db(rqs[i].baud_rate/(12.5e9))),2)}',\
f'{rqs[i].OSNR}', f'{rqs[i].tsp_mode}' , f'{round(rqs[i].path_bandwidth * 1e-9,2)}' , f'{ceil(rqs[i].path_bandwidth / rqs[i].bit_rate) }']
else:
line = [f'{rqs[i].request_id}',f' {rqs[i].source} to {rqs[i].destination} : not feasible ']
data.append(line)
col_width = max(len(word) for row in data for word in row[2:]) # padding
firstcol_width = max(len(row[0]) for row in data ) # padding
secondcol_width = max(len(row[1]) for row in data ) # padding
for row in data:
firstcol = ''.join(row[0].ljust(firstcol_width))
secondcol = ''.join(row[1].ljust(secondcol_width))
remainingcols = ''.join(word.center(col_width,' ') for word in row[2:])
print(f'{firstcol} {secondcol} {remainingcols}')
if args.output :
result = []
        # assumes that the list of rqs and the list of propagatedpths have the same order
for i,p in enumerate(propagatedpths):
result.append(Result_element(rqs[i],p))
temp = path_result_json(result)
fnamecsv = f'{str(args.output)[0:len(str(args.output))-len(str(args.output.suffix))]}.csv'
fnamejson = f'{str(args.output)[0:len(str(args.output))-len(str(args.output.suffix))]}.json'
with open(fnamejson, 'w', encoding='utf-8') as f:
f.write(dumps(path_result_json(result), indent=2, ensure_ascii=False))
with open(fnamecsv,"w", encoding='utf-8') as fcsv :
jsontocsv(temp,equipment,fcsv)
print('\x1b[1;34;40m'+f'saving in {args.output} and {fnamecsv}'+ '\x1b[0m')


@@ -1,316 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
transmission_main_example.py
============================
Main example for transmission simulation.
Reads from network JSON (by default, `edfa_example_network.json`)
'''
from gnpy.core.equipment import load_equipment, trx_mode_params
from gnpy.core.utils import db2lin, lin2db, write_csv
from argparse import ArgumentParser
from sys import exit
from pathlib import Path
from json import loads
from collections import Counter
from logging import getLogger, basicConfig, INFO, ERROR, DEBUG
from numpy import linspace, mean, log10
from matplotlib.pyplot import show, axis, figure, title, text
from networkx import (draw_networkx_nodes, draw_networkx_edges,
draw_networkx_labels, dijkstra_path)
from gnpy.core.network import load_network, build_network, save_network, load_sim_params, configure_network
from gnpy.core.elements import Transceiver, Fiber, RamanFiber, Edfa, Roadm
from gnpy.core.info import create_input_spectral_information, SpectralInformation, Channel, Power, Pref
from gnpy.core.request import Path_request, RequestParams, compute_constrained_path, propagate2
from gnpy.core.exceptions import ConfigurationError, EquipmentConfigError, NetworkTopologyError
import gnpy.core.ansi_escapes as ansi_escapes
logger = getLogger(__name__)
def plot_baseline(network):
edges = set(network.edges())
pos = {n: (n.lng, n.lat) for n in network.nodes()}
labels = {n: n.location.city for n in network.nodes() if isinstance(n, Transceiver)}
city_labels = set(labels.values())
for n in network.nodes():
if n.location.city and n.location.city not in city_labels:
labels[n] = n.location.city
city_labels.add(n.location.city)
label_pos = pos
fig = figure()
kwargs = {'figure': fig, 'pos': pos}
plot = draw_networkx_nodes(network, nodelist=network.nodes(), node_color='#ababab', **kwargs)
draw_networkx_edges(network, edgelist=edges, edge_color='#ababab', **kwargs)
draw_networkx_labels(network, labels=labels, font_size=14, **{**kwargs, 'pos': label_pos})
axis('off')
show()
def plot_results(network, path, source, destination, infos):
path_edges = set(zip(path[:-1], path[1:]))
edges = set(network.edges()) - path_edges
pos = {n: (n.lng, n.lat) for n in network.nodes()}
nodes = {}
for k, (x, y) in pos.items():
nodes.setdefault((round(x, 1), round(y, 1)), []).append(k)
labels = {n: n.location.city for n in network.nodes() if isinstance(n, Transceiver)}
city_labels = set(labels.values())
for n in network.nodes():
if n.location.city and n.location.city not in city_labels:
labels[n] = n.location.city
city_labels.add(n.location.city)
label_pos = pos
fig = figure()
kwargs = {'figure': fig, 'pos': pos}
all_nodes = [n for n in network.nodes() if n not in path]
plot = draw_networkx_nodes(network, nodelist=all_nodes, node_color='#ababab', node_size=50, **kwargs)
draw_networkx_nodes(network, nodelist=path, node_color='#ff0000', node_size=55, **kwargs)
draw_networkx_edges(network, edgelist=edges, edge_color='#ababab', **kwargs)
draw_networkx_edges(network, edgelist=path_edges, edge_color='#ff0000', **kwargs)
draw_networkx_labels(network, labels=labels, font_size=14, **{**kwargs, 'pos': label_pos})
title(f'Propagating from {source.loc.city} to {destination.loc.city}')
axis('off')
heading = 'Spectral Information\n\n'
textbox = text(0.85, 0.20, heading, fontsize=14, fontname='Ubuntu Mono',
verticalalignment='top', transform=fig.axes[0].transAxes,
bbox={'boxstyle': 'round', 'facecolor': 'wheat', 'alpha': 0.5})
msgs = {(x, y): heading + '\n\n'.join(str(n) for n in ns if n in path)
for (x, y), ns in nodes.items()}
def hover(event):
if event.xdata is None or event.ydata is None:
return
        in_fig, _ = fig.contains(event)  # Figure.contains returns a (bool, details) tuple
        if in_fig:
x, y = round(event.xdata, 1), round(event.ydata, 1)
if (x, y) in msgs:
textbox.set_text(msgs[x, y])
else:
textbox.set_text(heading)
fig.canvas.draw_idle()
fig.canvas.mpl_connect('motion_notify_event', hover)
show()
def main(network, equipment, source, destination, sim_params, req=None):
result_dicts = {}
network_data = [{
'network_name' : str(args.filename),
'source' : source.uid,
'destination' : destination.uid
}]
result_dicts.update({'network': network_data})
design_data = [{
'power_mode' : equipment['Span']['default'].power_mode,
'span_power_range' : equipment['Span']['default'].delta_power_range_db,
'design_pch' : equipment['SI']['default'].power_dbm,
'baud_rate' : equipment['SI']['default'].baud_rate
}]
result_dicts.update({'design': design_data})
simulation_data = []
result_dicts.update({'simulation results': simulation_data})
power_mode = equipment['Span']['default'].power_mode
print('\n'.join([f'Power mode is set to {power_mode}',
f'=> it can be modified in eqpt_config.json - Span']))
pref_ch_db = lin2db(req.power*1e3) #reference channel power / span (SL=20dB)
pref_total_db = pref_ch_db + lin2db(req.nb_channel) #reference total power / span (SL=20dB)
build_network(network, equipment, pref_ch_db, pref_total_db)
path = compute_constrained_path(network, req)
    if any(isinstance(s, RamanFiber) for s in path):
if sim_params is None:
print(f'{ansi_escapes.red}Invocation error:{ansi_escapes.reset} RamanFiber requires passing simulation params via --sim-params')
exit(1)
configure_network(network, sim_params)
    spans = [s.length for s in path if isinstance(s, (RamanFiber, Fiber))]
print(f'\nThere are {len(spans)} fiber spans over {sum(spans)/1000:.0f} km between {source.uid} and {destination.uid}')
print(f'\nNow propagating between {source.uid} and {destination.uid}:')
try:
p_start, p_stop, p_step = equipment['SI']['default'].power_range_db
p_num = abs(int(round((p_stop - p_start)/p_step))) + 1 if p_step != 0 else 1
power_range = list(linspace(p_start, p_stop, p_num))
except TypeError:
print('invalid power range definition in eqpt_config, should be power_range_db: [lower, upper, step]')
power_range = [0]
if not power_mode:
#power cannot be changed in gain mode
power_range = [0]
for dp_db in power_range:
req.power = db2lin(pref_ch_db + dp_db)*1e-3
if power_mode:
print(f'\nPropagating with input power = {ansi_escapes.cyan}{lin2db(req.power*1e3):.2f} dBm{ansi_escapes.reset}:')
else:
print(f'\nPropagating in {ansi_escapes.cyan}gain mode{ansi_escapes.reset}: power cannot be set manually')
infos = propagate2(path, req, equipment)
if len(power_range) == 1:
for elem in path:
print(elem)
if power_mode:
print(f'\nTransmission result for input power = {lin2db(req.power*1e3):.2f} dBm:')
else:
print(f'\nTransmission results:')
print(f' Final SNR total (signal bw): {ansi_escapes.cyan}{mean(destination.snr):.02f} dB{ansi_escapes.reset}')
#print(f'\n !!!!!!!!!!!!!!!!! TEST POINT !!!!!!!!!!!!!!!!!!!!!')
#print(f'carriers ase output of {path[1]} =\n {list(path[1].carriers("out", "nli"))}')
# => use "in" or "out" parameter
# => use "nli" or "ase" or "signal" or "total" parameter
if power_mode:
simulation_data.append({
'Pch_dBm' : pref_ch_db + dp_db,
'OSNR_ASE_0.1nm' : round(mean(destination.osnr_ase_01nm),2),
'OSNR_ASE_signal_bw' : round(mean(destination.osnr_ase),2),
'SNR_nli_signal_bw' : round(mean(destination.osnr_nli),2),
'SNR_total_signal_bw' : round(mean(destination.snr),2)
})
else:
simulation_data.append({
                'gain_mode': 'power cannot be set',
'OSNR_ASE_0.1nm' : round(mean(destination.osnr_ase_01nm),2),
'OSNR_ASE_signal_bw' : round(mean(destination.osnr_ase),2),
'SNR_nli_signal_bw' : round(mean(destination.osnr_nli),2),
'SNR_total_signal_bw' : round(mean(destination.snr),2)
})
write_csv(result_dicts, 'simulation_result.csv')
return path, infos
parser = ArgumentParser()
parser.add_argument('-e', '--equipment', type=Path,
default=Path(__file__).parent / 'eqpt_config.json')
parser.add_argument('--sim-params', type=Path,
default=None, help='Path to the JSON containing simulation parameters (required for Raman)')
parser.add_argument('--show-channels', action='store_true', help='Show final per-channel OSNR summary')
parser.add_argument('-pl', '--plot', action='store_true')
parser.add_argument('-v', '--verbose', action='count', default=0, help='increases verbosity for each occurrence')
parser.add_argument('-l', '--list-nodes', action='store_true', help='list all transceiver nodes')
parser.add_argument('-po', '--power', default=0, help='channel ref power in dBm')
parser.add_argument('-names', '--names-matching', action='store_true', help='display network names that are close matches')
parser.add_argument('filename', nargs='?', type=Path,
default=Path(__file__).parent / 'edfa_example_network.json')
parser.add_argument('source', nargs='?', help='source node')
parser.add_argument('destination', nargs='?', help='destination node')
if __name__ == '__main__':
args = parser.parse_args()
basicConfig(level={0: ERROR, 1: INFO, 2: DEBUG}.get(args.verbose, DEBUG))
try:
equipment = load_equipment(args.equipment)
network = load_network(args.filename, equipment, args.names_matching)
sim_params = load_sim_params(args.sim_params) if args.sim_params is not None else None
except EquipmentConfigError as e:
print(f'{ansi_escapes.red}Configuration error in the equipment library:{ansi_escapes.reset} {e}')
exit(1)
except NetworkTopologyError as e:
print(f'{ansi_escapes.red}Invalid network definition:{ansi_escapes.reset} {e}')
exit(1)
except ConfigurationError as e:
print(f'{ansi_escapes.red}Configuration error:{ansi_escapes.reset} {e}')
exit(1)
if args.plot:
plot_baseline(network)
transceivers = {n.uid: n for n in network.nodes() if isinstance(n, Transceiver)}
if not transceivers:
exit('Network has no transceivers!')
if len(transceivers) < 2:
exit('Network has only one transceiver!')
if args.list_nodes:
for uid in transceivers:
print(uid)
exit()
#First try to find exact match if source/destination provided
if args.source:
source = transceivers.pop(args.source, None)
        valid_source = bool(source)
else:
source = None
logger.info('No source node specified: picking random transceiver')
if args.destination:
destination = transceivers.pop(args.destination, None)
        valid_destination = bool(destination)
else:
destination = None
logger.info('No destination node specified: picking random transceiver')
#If no exact match try to find partial match
if args.source and not source:
        # TODO: use a more advanced regex to find node matches
source = next((transceivers.pop(uid) for uid in transceivers \
if args.source.lower() in uid.lower()), None)
if args.destination and not destination:
        # TODO: use a more advanced regex to find node matches
destination = next((transceivers.pop(uid) for uid in transceivers \
if args.destination.lower() in uid.lower()), None)
    # if no partial match, or no source/destination provided, pick an arbitrary transceiver
if not source:
source = list(transceivers.values())[0]
del transceivers[source.uid]
if not destination:
destination = list(transceivers.values())[0]
logger.info(f'source = {args.source!r}')
logger.info(f'destination = {args.destination!r}')
params = {}
params['request_id'] = 0
params['trx_type'] = ''
params['trx_mode'] = ''
params['source'] = source.uid
params['destination'] = destination.uid
params['nodes_list'] = [destination.uid]
params['loose_list'] = ['strict']
params['format'] = ''
params['path_bandwidth'] = 0
trx_params = trx_mode_params(equipment)
if args.power:
trx_params['power'] = db2lin(float(args.power))*1e-3
params.update(trx_params)
req = Path_request(**params)
path, infos = main(network, equipment, source, destination, sim_params, req)
save_network(args.filename, network)
if args.show_channels:
print('\nThe total SNR per channel at the end of the line is:')
print('{:>5}{:>26}{:>26}{:>28}{:>28}{:>28}' \
.format('Ch. #', 'Channel frequency (THz)', 'Channel power (dBm)', 'OSNR ASE (signal bw, dB)', 'SNR NLI (signal bw, dB)', 'SNR total (signal bw, dB)'))
for final_carrier, ch_osnr, ch_snr_nl, ch_snr in zip(infos[path[-1]][1].carriers, path[-1].osnr_ase, path[-1].osnr_nli, path[-1].snr):
ch_freq = final_carrier.frequency * 1e-12
ch_power = lin2db(final_carrier.power.signal*1e3)
print('{:5}{:26.2f}{:26.2f}{:28.2f}{:28.2f}{:28.2f}' \
.format(final_carrier.channel_number, round(ch_freq, 2), round(ch_power, 2), round(ch_osnr, 2), round(ch_snr_nl, 2), round(ch_snr, 2)))
if not args.source:
print(f'\n(No source node specified: picked {source.uid})')
elif not valid_source:
print(f'\n(Invalid source node {args.source!r} replaced with {source.uid})')
if not args.destination:
print(f'\n(No destination node specified: picked {destination.uid})')
elif not valid_destination:
print(f'\n(Invalid destination node {args.destination!r} replaced with {destination.uid})')
if args.plot:
plot_results(network, path, source, destination, infos)


@@ -0,0 +1,8 @@
'''
GNPy is an open-source, community-developed library for building route planning and optimization tools in real-world mesh optical networks. It is based on the Gaussian Noise Model.
Signal propagation is implemented in :py:mod:`.core`.
Path finding and spectrum assignment is in :py:mod:`.topology`.
Various tools and auxiliary code, including the JSON I/O handling, live in
:py:mod:`.tools`.
'''

gnpy/api/__init__.py (new file)

@@ -0,0 +1,9 @@
# coding: utf-8
from flask import Flask
app = Flask(__name__)
# route modules are imported for their side effect: their @app.route
# decorators register the endpoints on the Flask app created above
import gnpy.api.route.path_request_route
import gnpy.api.route.status_route
import gnpy.api.route.topology_route
import gnpy.api.route.equipments_route


@@ -0,0 +1 @@
# coding: utf-8


@@ -0,0 +1,14 @@
# coding: utf-8
class ConfigError(Exception):
""" Exception raise for configuration file error
Attributes:
message -- explanation of the error
"""
def __init__(self, message):
self.message = message
def __str__(self):
return self.message


@@ -0,0 +1,14 @@
# coding: utf-8
class EquipmentError(Exception):
""" Exception raise for equipment error
Attributes:
message -- explanation of the error
"""
def __init__(self, message):
self.message = message
def __str__(self):
return self.message


@@ -0,0 +1,33 @@
# coding: utf-8
import json
import re
import werkzeug
from gnpy.api.model.error import Error
_reaesc = re.compile(r'\x1b[^m]*m')
def common_error_handler(exception):
"""
:type exception: Exception
"""
status_code = 500
if not isinstance(exception, werkzeug.exceptions.HTTPException):
exception = werkzeug.exceptions.InternalServerError()
exception.description = "Something went wrong on our side."
else:
status_code = exception.code
response = Error(message=exception.name, description=exception.description,
code=status_code)
return werkzeug.Response(response=json.dumps(response.__dict__), status=status_code, mimetype='application/json')
def bad_request_handler(exception):
response = Error(message='bad request', description=_reaesc.sub('', str(exception)),
code=400)
return werkzeug.Response(response=json.dumps(response.__dict__), status=400, mimetype='application/json')


@@ -0,0 +1,14 @@
# coding: utf-8
class PathComputationError(Exception):
""" Exception raise for path computation error error
Attributes:
message -- explanation of the error
"""
def __init__(self, message):
self.message = message
def __str__(self):
return self.message


@@ -0,0 +1,14 @@
# coding: utf-8
class TopologyError(Exception):
""" Exception raise for topology error
Attributes:
message -- explanation of the error
"""
def __init__(self, message):
self.message = message
def __str__(self):
return self.message


@@ -0,0 +1 @@
# coding: utf-8

gnpy/api/model/error.py (new file)

@@ -0,0 +1,17 @@
# coding: utf-8
class Error:
def __init__(self, code: int = None, message: str = None, description: str = None):
"""Error
:param code: The code of this Error.
:type code: int
:param message: The message of this Error.
:type message: str
:param description: The description of this Error.
:type description: str
"""
self.code = code
self.message = message
self.description = description

gnpy/api/model/result.py (new file)

@@ -0,0 +1,8 @@
# coding: utf-8
class Result:
def __init__(self, message: str = None, description: str = None):
self.message = message
self.description = description

gnpy/api/rest_example.py (new file)

@@ -0,0 +1,83 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.api.rest_example
=====================
GNPy as a REST API example
'''
import logging
from logging.handlers import RotatingFileHandler
import werkzeug
from flask_injector import FlaskInjector
from injector import singleton
from werkzeug.exceptions import InternalServerError
import gnpy.core.exceptions as exceptions
from gnpy.api import app
from gnpy.api.exception.exception_handler import bad_request_handler, common_error_handler
from gnpy.api.exception.path_computation_error import PathComputationError
from gnpy.api.exception.topology_error import TopologyError
from gnpy.api.service import config_service
from gnpy.api.service.encryption_service import EncryptionService
from gnpy.api.service.equipment_service import EquipmentService
from gnpy.api.service.path_request_service import PathRequestService
_logger = logging.getLogger(__name__)
def _init_logger():
handler = RotatingFileHandler('api.log', maxBytes=1024 * 1024, backupCount=5, encoding='utf-8')
ch = logging.StreamHandler()
logging.basicConfig(level=logging.INFO, handlers=[handler, ch],
format="%(asctime)s %(levelname)s %(name)s(%(lineno)s) [%(threadName)s - %(thread)d] - %("
"message)s")
def _init_app(key):
app.register_error_handler(KeyError, bad_request_handler)
app.register_error_handler(TypeError, bad_request_handler)
app.register_error_handler(ValueError, bad_request_handler)
app.register_error_handler(exceptions.ConfigurationError, bad_request_handler)
app.register_error_handler(exceptions.DisjunctionError, bad_request_handler)
app.register_error_handler(exceptions.EquipmentConfigError, bad_request_handler)
app.register_error_handler(exceptions.NetworkTopologyError, bad_request_handler)
app.register_error_handler(exceptions.ServiceError, bad_request_handler)
app.register_error_handler(exceptions.SpectrumError, bad_request_handler)
app.register_error_handler(exceptions.ParametersError, bad_request_handler)
app.register_error_handler(AssertionError, bad_request_handler)
app.register_error_handler(InternalServerError, common_error_handler)
app.register_error_handler(TopologyError, bad_request_handler)
app.register_error_handler(PathComputationError, bad_request_handler)
for error_code in werkzeug.exceptions.default_exceptions:
app.register_error_handler(error_code, common_error_handler)
config = config_service.init_config()
config.add_section('SECRET')
config.set('SECRET', 'equipment', key)
app.config['properties'] = config
def _configure(binder):
binder.bind(EquipmentService,
to=EquipmentService(EncryptionService(app.config['properties'].get('SECRET', 'equipment'))),
scope=singleton)
binder.bind(PathRequestService,
to=PathRequestService(EncryptionService(app.config['properties'].get('SECRET', 'equipment'))),
scope=singleton)
app.config['properties'].pop('SECRET', None)
def main():
key = input('Enter encryption/decryption key: ')
_init_logger()
_init_app(key)
FlaskInjector(app=app, modules=[_configure])
app.run(host='0.0.0.0', port=8080, ssl_context='adhoc')
if __name__ == '__main__':
main()


@@ -0,0 +1,2 @@
# coding: utf-8


@@ -0,0 +1,38 @@
# coding: utf-8
import http
import json
from flask import request
from gnpy.api import app
from gnpy.api.exception.equipment_error import EquipmentError
from gnpy.api.model.result import Result
from gnpy.api.service.equipment_service import EquipmentService
EQUIPMENT_BASE_PATH = '/api/v1/equipments'
EQUIPMENT_ID_PATH = EQUIPMENT_BASE_PATH + '/<equipment_id>'
@app.route(EQUIPMENT_BASE_PATH, methods=['POST'])
def create_equipment(equipment_service: EquipmentService):
if not request.is_json:
raise EquipmentError('Request body is not json')
equipment_identifier = equipment_service.save_equipment(request.json)
response = Result(message='Equipment creation ok', description=equipment_identifier)
return json.dumps(response.__dict__), 201, {'location': EQUIPMENT_BASE_PATH + '/' + equipment_identifier}
@app.route(EQUIPMENT_ID_PATH, methods=['PUT'])
def update_equipment(equipment_id, equipment_service: EquipmentService):
if not request.is_json:
raise EquipmentError('Request body is not json')
equipment_identifier = equipment_service.update_equipment(request.json, equipment_id)
response = Result(message='Equipment update ok', description=equipment_identifier)
return json.dumps(response.__dict__), http.HTTPStatus.OK, {
'location': EQUIPMENT_BASE_PATH + '/' + equipment_identifier}
@app.route(EQUIPMENT_ID_PATH, methods=['DELETE'])
def delete_equipment(equipment_id, equipment_service: EquipmentService):
equipment_service.delete_equipment(equipment_id)
return '', http.HTTPStatus.NO_CONTENT
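For reference, a successful creation through the handler above returns HTTP 201
with a ``location`` header pointing at the new resource and a body serialized
from the ``Result`` model; the identifier below is a made-up placeholder for
the generated UUID:

.. code-block:: json

   {
       "message": "Equipment creation ok",
       "description": "<equipment-uuid>"
   }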


@@ -0,0 +1,63 @@
# coding: utf-8
import http
import os
from pathlib import Path
from flask import request
from gnpy.api import app
from gnpy.api.exception.equipment_error import EquipmentError
from gnpy.api.exception.topology_error import TopologyError
from gnpy.api.service import topology_service
from gnpy.api.service.equipment_service import EquipmentService
from gnpy.api.service.path_request_service import PathRequestService
from gnpy.tools.json_io import _equipment_from_json, network_from_json
from gnpy.topology.request import ResultElement
PATH_COMPUTATION_BASE_PATH = '/api/v1/path-computation'
AUTODESIGN_PATH = PATH_COMPUTATION_BASE_PATH + '/<path_computation_id>/autodesign'
_examples_dir = Path(__file__).parent.parent.parent / 'example-data'
@app.route(PATH_COMPUTATION_BASE_PATH, methods=['POST'])
def compute_path(equipment_service: EquipmentService, path_request_service: PathRequestService):
data = request.json
service = data['gnpy-api:service']
if 'gnpy-api:topology' in data:
topology = data['gnpy-api:topology']
elif 'gnpy-api:topology_id' in data:
topology = topology_service.get_topology(data['gnpy-api:topology_id'])
else:
raise TopologyError('No topology found in request')
if 'gnpy-api:equipment' in data:
equipment = data['gnpy-api:equipment']
elif 'gnpy-api:equipment_id' in data:
equipment = equipment_service.get_equipment(data['gnpy-api:equipment_id'])
else:
raise EquipmentError('No equipment found in request')
equipment = _equipment_from_json(equipment,
os.path.join(_examples_dir, 'std_medium_gain_advanced_config.json'))
network = network_from_json(topology, equipment)
propagatedpths, reversed_propagatedpths, rqs, path_computation_id = path_request_service.path_requests_run(service,
network,
equipment)
# Generate the output
result = []
# assumes that list of rqs and list of propgatedpths have same order
for i, pth in enumerate(propagatedpths):
result.append(ResultElement(rqs[i], pth, reversed_propagatedpths[i]))
return {"result": {"response": [n.json for n in result]}}, 201, {
'location': AUTODESIGN_PATH.replace('<path_computation_id>', path_computation_id)}
@app.route(AUTODESIGN_PATH, methods=['GET'])
def get_autodesign(path_computation_id, path_request_service: PathRequestService):
return path_request_service.get_autodesign(path_computation_id), http.HTTPStatus.OK
@app.route(AUTODESIGN_PATH, methods=['DELETE'])
def delete_autodesign(path_computation_id, path_request_service: PathRequestService):
path_request_service.delete_autodesign(path_computation_id)
return '', http.HTTPStatus.NO_CONTENT
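Following the handler above, a POST body for ``/api/v1/path-computation`` must
carry a ``gnpy-api:service`` object (whose ``path-request`` and
``synchronization`` lists follow the service JSON consumed by
``requests_from_json`` and ``disjunctions_from_json``), plus either inline
``gnpy-api:topology``/``gnpy-api:equipment`` data or the identifiers returned
by the topology and equipment endpoints. A minimal sketch with placeholder
identifiers and empty request lists:

.. code-block:: json

   {
       "gnpy-api:service": {
           "path-request": [],
           "synchronization": []
       },
       "gnpy-api:topology_id": "<topology-uuid>",
       "gnpy-api:equipment_id": "<equipment-uuid>"
   }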


@@ -0,0 +1,7 @@
# coding: utf-8
from gnpy.api import app
@app.route('/api/v1/status', methods=['GET'])
def api_status():
return {"version": "v1", "status": "ok"}, 200


@@ -0,0 +1,43 @@
# coding: utf-8
import http
import json
from flask import request
from gnpy.api import app
from gnpy.api.exception.topology_error import TopologyError
from gnpy.api.model.result import Result
from gnpy.api.service import topology_service
TOPOLOGY_BASE_PATH = '/api/v1/topologies'
TOPOLOGY_ID_PATH = TOPOLOGY_BASE_PATH + '/<topology_id>'
@app.route(TOPOLOGY_BASE_PATH, methods=['POST'])
def create_topology():
if not request.is_json:
raise TopologyError('Request body is not json')
topology_identifier = topology_service.save_topology(request.json)
response = Result(message='Topology creation ok', description=topology_identifier)
return json.dumps(response.__dict__), 201, {'location': TOPOLOGY_BASE_PATH + '/' + topology_identifier}
@app.route(TOPOLOGY_ID_PATH, methods=['PUT'])
def update_topology(topology_id):
if not request.is_json:
raise TopologyError('Request body is not json')
topology_identifier = topology_service.update_topology(request.json, topology_id)
response = Result(message='Topology update ok', description=topology_identifier)
return json.dumps(response.__dict__), http.HTTPStatus.OK, {
'location': TOPOLOGY_BASE_PATH + '/' + topology_identifier}
@app.route(TOPOLOGY_ID_PATH, methods=['GET'])
def get_topology(topology_id):
return topology_service.get_topology(topology_id), http.HTTPStatus.OK
@app.route(TOPOLOGY_ID_PATH, methods=['DELETE'])
def delete_topology(topology_id):
topology_service.delete_topology(topology_id)
return '', http.HTTPStatus.NO_CONTENT


@@ -0,0 +1 @@
# coding: utf-8


@@ -0,0 +1,45 @@
# coding: utf-8
import configparser
import os
from flask import current_app
from gnpy.api.exception.config_error import ConfigError
def init_config(properties_file_path: str = os.path.join(os.path.dirname(__file__),
'properties.ini')) -> configparser.ConfigParser:
"""
Read config from properties_file_path
@param properties_file_path: the properties file to read
@return: config parser
"""
if not os.path.exists(properties_file_path):
raise ConfigError('Properties file does not exist ' + properties_file_path)
config = configparser.ConfigParser()
config.read(properties_file_path)
return config
def get_topology_dir() -> str:
"""
Get the base dir where topologies are saved
@return: the directory of topologies
"""
return current_app.config['properties'].get('DIRECTORY', 'topology')
def get_equipment_dir() -> str:
"""
    Get the base dir where equipment files are saved
    @return: the directory of equipment files
"""
return current_app.config['properties'].get('DIRECTORY', 'equipment')
def get_autodesign_dir() -> str:
"""
    Get the base dir where autodesigns are saved
    @return: the directory of autodesigns
"""
return current_app.config['properties'].get('DIRECTORY', 'autodesign')


@@ -0,0 +1,13 @@
# coding: utf-8
from cryptography.fernet import Fernet
class EncryptionService:
def __init__(self, key):
self._fernet = Fernet(key)
def encrypt(self, data):
return self._fernet.encrypt(data)
def decrypt(self, data):
return self._fernet.decrypt(data)


@@ -0,0 +1,66 @@
# coding: utf-8
import json
import os
import uuid
from injector import Inject
from gnpy.api.exception.equipment_error import EquipmentError
from gnpy.api.service import config_service
from gnpy.api.service.encryption_service import EncryptionService
class EquipmentService:
def __init__(self, encryption_service: EncryptionService):
self.encryption = encryption_service
def save_equipment(self, equipment):
"""
Save equipment to file.
@param equipment: json content
@return: a UUID identifier to identify the equipment
"""
equipment_identifier = str(uuid.uuid4())
# TODO: validate json content
self._write_equipment(equipment, equipment_identifier)
return equipment_identifier
def update_equipment(self, equipment, equipment_identifier):
"""
Update equipment with identifier equipment_identifier.
@param equipment_identifier: the identifier of the equipment to be updated
@param equipment: json content
@return: a UUID identifier to identify the equipment
"""
# TODO: validate json content
self._write_equipment(equipment, equipment_identifier)
return equipment_identifier
def _write_equipment(self, equipment, equipment_identifier):
equipment_dir = config_service.get_equipment_dir()
with(open(os.path.join(equipment_dir, '.'.join([equipment_identifier, 'json'])), 'wb')) as file:
file.write(self.encryption.encrypt(json.dumps(equipment).encode()))
def get_equipment(self, equipment_id: str) -> dict:
"""
Get the equipment with id equipment_id
@param equipment_id:
@return: the equipment in json format
"""
equipment_dir = config_service.get_equipment_dir()
equipment_file = os.path.join(equipment_dir, '.'.join([equipment_id, 'json']))
if not os.path.exists(equipment_file):
raise EquipmentError('Equipment with id {} does not exist '.format(equipment_id))
with(open(equipment_file, 'rb')) as file:
return json.loads(self.encryption.decrypt(file.read()))
def delete_equipment(self, equipment_id: str):
"""
Delete equipment with id equipment_id
@param equipment_id:
"""
equipment_dir = config_service.get_equipment_dir()
equipment_file = os.path.join(equipment_dir, '.'.join([equipment_id, 'json']))
if os.path.exists(equipment_file):
os.remove(equipment_file)


@@ -0,0 +1,100 @@
# -*- coding: utf-8 -*-
import json
import logging
import os
import uuid
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.api.exception.path_computation_error import PathComputationError
from gnpy.api.service import config_service
from gnpy.api.service.encryption_service import EncryptionService
from gnpy.core.network import build_network
from gnpy.core.utils import lin2db, automatic_nch
from gnpy.tools.json_io import requests_from_json, disjunctions_from_json, network_to_json
from gnpy.topology.request import (compute_path_dsjctn, requests_aggregation,
correct_json_route_list,
deduplicate_disjunctions, compute_path_with_disjunction)
from gnpy.topology.spectrum_assignment import build_oms_list, pth_assign_spectrum
_logger = logging.getLogger(__name__)
class PathRequestService:
def __init__(self, encryption_service: EncryptionService):
self.encryption = encryption_service
def path_requests_run(self, service, network, equipment):
# Build the network once using the default power defined in SI in eqpt config
        # TODO power density: db2lin(power_dbm)/power_dbm * nb channels as defined by
        # spacing, f_min and f_max
p_db = equipment['SI']['default'].power_dbm
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
path_computation_identifier = str(uuid.uuid4())
autodesign_dir = config_service.get_autodesign_dir()
with(open(os.path.join(autodesign_dir, '.'.join([path_computation_identifier, 'json'])), 'wb')) as file:
file.write(self.encryption.encrypt(json.dumps(network_to_json(network)).encode()))
oms_list = build_oms_list(network, equipment)
rqs = requests_from_json(service, equipment)
        # check that request ids are unique; non-unique ids may
        # mess up the computation: better to stop it
all_ids = [r.request_id for r in rqs]
if len(all_ids) != len(set(all_ids)):
for item in list(set(all_ids)):
all_ids.remove(item)
msg = f'Requests id {all_ids} are not unique'
_logger.critical(msg)
            raise ValueError(msg)
rqs = correct_json_route_list(network, rqs)
# pths = compute_path(network, equipment, rqs)
dsjn = disjunctions_from_json(service)
# need to warn or correct in case of wrong disjunction form
# disjunction must not be repeated with same or different ids
dsjn = deduplicate_disjunctions(dsjn)
rqs, dsjn = requests_aggregation(rqs, dsjn)
# TODO export novel set of aggregated demands in a json file
_logger.info(f'{ansi_escapes.blue}The following services have been requested:{ansi_escapes.reset}' + str(rqs))
_logger.info(f'{ansi_escapes.blue}Computing all paths with constraints{ansi_escapes.reset}')
pths = compute_path_dsjctn(network, equipment, rqs, dsjn)
_logger.info(f'{ansi_escapes.blue}Propagating on selected path{ansi_escapes.reset}')
propagatedpths, reversed_pths, reversed_propagatedpths = compute_path_with_disjunction(network, equipment, rqs,
pths)
        # note that deepcopy used in compute_path_with_disjunction returns
        # a list of nodes which do not belong to the network (they are copies of the node objects),
        # so no propagation can be performed on these nodes
pth_assign_spectrum(pths, rqs, oms_list, reversed_pths)
return propagatedpths, reversed_propagatedpths, rqs, path_computation_identifier
def get_autodesign(self, path_computation_id):
"""
Get the autodesign with id topology_id
@param path_computation_id:
@return: the autodesign in json format
"""
autodesign_dir = config_service.get_autodesign_dir()
autodesign_file = os.path.join(autodesign_dir, '.'.join([path_computation_id, 'json']))
if not os.path.exists(autodesign_file):
raise PathComputationError('Autodesign with id {} does not exist '.format(path_computation_id))
with(open(autodesign_file, 'rb')) as file:
return json.loads(self.encryption.decrypt(file.read()))
def delete_autodesign(self, path_computation_id: str):
"""
Delete autodesign with id equipment_id
@param path_computation_id:
"""
autodesign_dir = config_service.get_autodesign_dir()
autodesign_file = os.path.join(autodesign_dir, '.'.join([path_computation_id, 'json']))
if os.path.exists(autodesign_file):
os.remove(autodesign_file)
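
A minimal usage sketch of the service above, assuming a Fernet-backed EncryptionService exposing encrypt/decrypt on bytes (the wrapper below is an illustrative guess based on the calls above, not the actual gnpy.api implementation):

from cryptography.fernet import Fernet

class EncryptionService:
    """Hypothetical wrapper; only encrypt()/decrypt() are used by PathRequestService."""
    def __init__(self, key: bytes):
        self._fernet = Fernet(key)
    def encrypt(self, data: bytes) -> bytes:
        return self._fernet.encrypt(data)
    def decrypt(self, token: bytes) -> bytes:
        return self._fernet.decrypt(token)

svc = PathRequestService(EncryptionService(Fernet.generate_key()))
# propagated, reversed_propagated, rqs, computation_id = svc.path_requests_run(service_json, network, equipment)
# autodesign = svc.get_autodesign(computation_id)   # decrypts the stored auto-designed network
# svc.delete_autodesign(computation_id)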

View File

@@ -0,0 +1,4 @@
[DIRECTORY]
topology: /opt/application/oopt-gnpy/topology
equipment: /opt/application/oopt-gnpy/equipment
autodesign: /opt/application/oopt-gnpy/autodesign
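
The helpers in gnpy.api.service.config_service resolve these directories; a minimal sketch of such a lookup with configparser, assuming the INI file lives at the path below (both the location and the function body are illustrative):

import configparser

def get_autodesign_dir() -> str:
    # read the [DIRECTORY] section shown above; the config file path is an assumption
    parser = configparser.ConfigParser()
    parser.read('/opt/application/oopt-gnpy/config.ini')
    return parser['DIRECTORY']['autodesign']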

View File

@@ -0,0 +1,62 @@
# coding: utf-8
import json
import os
import uuid
from gnpy.api.exception.topology_error import TopologyError
from gnpy.api.service import config_service
def save_topology(topology):
"""
Save topology to file.
@param topology: json content
@return: a UUID identifier to identify the topology
"""
topology_identifier = str(uuid.uuid4())
# TODO: validate json content
_write_topology(topology, topology_identifier)
return topology_identifier
def update_topology(topology, topology_identifier):
"""
Update topology with identifier topology_identifier.
@param topology_identifier: the identifier of the topology to be updated
@param topology: json content
@return: a UUID identifier to identify the topology
"""
# TODO: validate json content
_write_topology(topology, topology_identifier)
return topology_identifier
def _write_topology(topology, topology_identifier):
topology_dir = config_service.get_topology_dir()
with(open(os.path.join(topology_dir, '.'.join([topology_identifier, 'json'])), 'w')) as file:
json.dump(topology, file)
def get_topology(topology_id: str) -> dict:
"""
Get the topology with id topology_id
@param topology_id:
@return: the topology in json format
"""
topology_dir = config_service.get_topology_dir()
topology_file = os.path.join(topology_dir, '.'.join([topology_id, 'json']))
if not os.path.exists(topology_file):
raise TopologyError('Topology with id {} does not exist'.format(topology_id))
with(open(topology_file, 'r')) as file:
return json.load(file)
def delete_topology(topology_id: str):
"""
Delete topology with id topology_id
@param topology_id:
"""
topology_dir = config_service.get_topology_dir()
topology_file = os.path.join(topology_dir, '.'.join([topology_id, 'json']))
if os.path.exists(topology_file):
os.remove(topology_file)
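
A quick round trip through the helpers above, as a sketch (the payload is illustrative; real topologies carry the 'elements' and 'connections' keys produced by network_to_json, and the [DIRECTORY] config is assumed to be in place):

topology_id = save_topology({'elements': [], 'connections': []})
loaded = get_topology(topology_id)        # returns the dict that was written
update_topology({'elements': [], 'connections': []}, topology_id)
delete_topology(topology_id)              # silently ignores an already-removed id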

View File

@@ -1,30 +1,9 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
########################################################################
# _____ ___ ____ ____ ____ _____ #
# |_ _|_ _| _ \ | _ \/ ___|| ____| #
# | | | || |_) | | |_) \___ \| _| #
# | | | || __/ | __/ ___) | |___ #
# |_| |___|_| |_| |____/|_____| #
# #
# == Physical Simulation Environment == #
# #
########################################################################
'''
gnpy route planning and optimization library
============================================
Simulation of signal propagation in the DWDM network
gnpy is a route planning and optimization library, written in Python, for
operators of large-scale mesh optical networks.
:copyright: © 2018, Telecom Infra Project
:license: BSD 3-Clause, see LICENSE for more details.
Optical signals, as defined via :class:`.info.SpectralInformation`, enter
:py:mod:`.elements` which compute how these signals are affected as they travel
through the :py:mod:`.network`.
The simulation is controlled via :py:mod:`.parameters` and implemented mainly
via :py:mod:`.science_utils`.
'''
from . import elements
from .execute import *
from .network import *
from .utils import *

View File

@@ -9,5 +9,7 @@ A random subset of ANSI terminal escape codes for colored messages
'''
red = '\x1b[1;31;40m'
blue = '\x1b[1;34;40m'
cyan = '\x1b[1;36;40m'
yellow = '\x1b[1;33;40m'
reset = '\x1b[0m'

View File

@@ -1,627 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.core.convert
=================
This module contains utilities for converting between XLS and JSON.
The input XLS file must contain sheets named "Nodes" and "Links".
It may optionally contain a sheet named "Eqpt".
In the "Nodes" sheet, only the "City" column is mandatory. The column "Type"
can be determined automatically given the topology (e.g., if degree 2, ILA;
otherwise, ROADM.) Incorrectly specified types (e.g., ILA for node of
degree ≠ 2) will be automatically corrected.
In the "Links" sheet, only the first three columns ("Node A", "Node Z" and
"east Distance (km)") are mandatory. Missing "west" information is copied from
the "east" information so that it is possible to input undirected data.
"""
from sys import exit
try:
from xlrd import open_workbook
except ModuleNotFoundError:
exit('Required: `pip install xlrd`')
from argparse import ArgumentParser
from collections import namedtuple, Counter, defaultdict
from itertools import chain
from json import dumps
from pathlib import Path
from difflib import get_close_matches
from gnpy.core.utils import silent_remove
import time
all_rows = lambda sh, start=0: (sh.row(x) for x in range(start, sh.nrows))
class Node(object):
def __init__(self, **kwargs):
super(Node, self).__init__()
self.update_attr(kwargs)
def update_attr(self, kwargs):
clean_kwargs = {k:v for k,v in kwargs.items() if v !=''}
for k,v in self.default_values.items():
v = clean_kwargs.get(k,v)
setattr(self, k, v)
default_values = \
{
'city': '',
'state': '',
'country': '',
'region': '',
'latitude': 0,
'longitude': 0,
'node_type': 'ILA',
'booster_restriction' : '',
'preamp_restriction' : ''
}
class Link(object):
"""attribtes from west parse_ept_headers dict
+node_a, node_z, west_fiber_con_in, east_fiber_con_in
"""
def __init__(self, **kwargs):
super(Link, self).__init__()
self.update_attr(kwargs)
self.distance_units = 'km'
def update_attr(self, kwargs):
clean_kwargs = {k:v for k,v in kwargs.items() if v !=''}
for k,v in self.default_values.items():
v = clean_kwargs.get(k,v)
setattr(self, k, v)
k = 'west' + k.split('east')[-1]
v = clean_kwargs.get(k,v)
setattr(self, k, v)
def __eq__(self, link):
return (self.from_city == link.from_city and self.to_city == link.to_city) \
or (self.from_city == link.to_city and self.to_city == link.from_city)
default_values = \
{
'from_city': '',
'to_city': '',
'east_distance': 80,
'east_fiber': 'SSMF',
'east_lineic': 0.2,
'east_con_in': None,
'east_con_out': None,
'east_pmd': 0.1,
'east_cable': ''
}
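# Example (sketch): Link(from_city='A', to_city='Z', east_distance=120) also ends up
# with west_distance == 120, because update_attr() mirrors every missing 'west_*'
# value from its 'east_*' counterpart, which is what allows undirected sheet input.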
class Eqpt(object):
def __init__(self, **kwargs):
super(Eqpt, self).__init__()
self.update_attr(kwargs)
def update_attr(self, kwargs):
clean_kwargs = {k:v for k,v in kwargs.items() if v !=''}
for k,v in self.default_values.items():
v_east = clean_kwargs.get(k,v)
setattr(self, k, v_east)
k = 'west' + k.split('east')[-1]
v_west = clean_kwargs.get(k,v)
setattr(self, k, v_west)
default_values = \
{
'from_city': '',
'to_city': '',
'east_amp_type': '',
'east_att_in': 0,
'east_amp_gain': None,
'east_amp_dp': None,
'east_tilt': 0,
'east_att_out': None
}
def read_header(my_sheet, line, slice_):
""" return the list of headers !:= ''
header_i = [(header, header_column_index), ...]
in a {line, slice1_x, slice_y} range
"""
Param_header = namedtuple('Param_header', 'header colindex')
try:
header = [x.value.strip() for x in my_sheet.row_slice(line, slice_[0], slice_[1])]
header_i = [Param_header(header,i+slice_[0]) for i, header in enumerate(header) if header != '']
except Exception:
header_i = []
if header_i != [] and header_i[-1].colindex != slice_[1]:
header_i.append(Param_header('',slice_[1]))
return header_i
def read_slice(my_sheet, line, slice_, header):
"""return the slice range of a given header
in a defined range {line, slice_x, slice_y}"""
header_i = read_header(my_sheet, line, slice_)
slice_range = (-1,-1)
if header_i != []:
try:
slice_range = next((h.colindex,header_i[i+1].colindex) \
for i,h in enumerate(header_i) if header in h.header)
except Exception:
pass
return slice_range
def parse_headers(my_sheet, input_headers_dict, headers, start_line, slice_in):
"""return a dict of header_slice
key = column index
value = header name"""
for h0 in input_headers_dict:
slice_out = read_slice(my_sheet, start_line, slice_in, h0)
iteration = 1
while slice_out == (-1,-1) and iteration < 10:
#try next lines
#print(h0, iteration)
slice_out = read_slice(my_sheet, start_line+iteration, slice_in, h0)
iteration += 1
if slice_out == (-1, -1):
if h0 in ('east', 'Node A', 'Node Z', 'City') :
print(f'\x1b[1;31;40m'+f'CRITICAL: missing _{h0}_ header: EXECUTION ENDS'+ '\x1b[0m')
exit()
else:
print(f'missing header {h0}')
elif not isinstance(input_headers_dict[h0], dict):
headers[slice_out[0]] = input_headers_dict[h0]
else:
headers = parse_headers(my_sheet, input_headers_dict[h0], headers, start_line+1, slice_out)
if headers == {}:
print(f'\x1b[1;31;40m'+f'CRITICAL ERROR: could not find any header to read _ ABORT'+ '\x1b[0m')
exit()
return headers
def parse_row(row, headers):
#print([label for label in ept.values()])
#print([i for i in ept.keys()])
#print(row[i for i in ept.keys()])
return {f: r.value for f, r in \
zip([label for label in headers.values()], [row[i] for i in headers])}
#if r.ctype != XL_CELL_EMPTY}
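# Example (sketch): with headers == {0: 'from_city', 1: 'to_city', 2: 'east_distance'},
# parse_row(row, headers) returns
# {'from_city': row[0].value, 'to_city': row[1].value, 'east_distance': row[2].value}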
def parse_sheet(my_sheet, input_headers_dict, header_line, start_line, column):
headers = parse_headers(my_sheet, input_headers_dict, {}, header_line, (0,column))
for row in all_rows(my_sheet, start=start_line):
yield parse_row(row[0: column], headers)
def sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city):
duplicate_links = []
for l1 in links:
for l2 in links:
if l1 is not l2 and l1 == l2 and l2 not in duplicate_links:
print(f'\nWARNING\n \
link {l1.from_city}-{l1.to_city} is duplicated \
\nthe 1st duplicate link will be removed but you should check the Links sheet input')
duplicate_links.append(l1)
#if duplicate_links != []:
#time.sleep(3)
for l in duplicate_links:
links.remove(l)
try :
test_nodes = [n for n in nodes_by_city if not n in links_by_city]
test_links = [n for n in links_by_city if not n in nodes_by_city]
test_eqpts = [n for n in eqpts_by_city if not n in nodes_by_city]
assert (test_nodes == [] or test_nodes == [''])\
and (test_links == [] or test_links ==[''])\
and (test_eqpts == [] or test_eqpts ==[''])
except AssertionError:
print(f'CRITICAL error: \nNames in Nodes and Links sheets do not match, check:\
\n{test_nodes} in Nodes sheet\
\n{test_links} in Links sheet\
\n{test_eqpts} in Eqpt sheet')
exit(1)
for city,link in links_by_city.items():
if nodes_by_city[city].node_type.lower()=='ila' and len(link) != 2:
#wrong input: ILA sites can only be Degree 2
# => correct to make it a ROADM and remove entry in links_by_city
#TODO : put in log rather than print
print(f'invalid node type ({nodes_by_city[city].node_type})\
specified in {city}, replaced by ROADM')
nodes_by_city[city].node_type = 'ROADM'
for n in nodes:
if n.city==city:
n.node_type='ROADM'
return nodes, links
def convert_file(input_filename, names_matching=False, filter_region=[]):
nodes, links, eqpts = parse_excel(input_filename)
if filter_region:
nodes = [n for n in nodes if n.region.lower() in filter_region]
cities = {n.city for n in nodes}
links = [lnk for lnk in links if lnk.from_city in cities and
lnk.to_city in cities]
cities = {lnk.from_city for lnk in links} | {lnk.to_city for lnk in links}
nodes = [n for n in nodes if n.city in cities]
global nodes_by_city
nodes_by_city = {n.city: n for n in nodes}
#create matching dictionary for node name mismatch analysis
cities = {''.join(c.strip() for c in n.city.split('C+L')).lower(): n.city for n in nodes}
cities_to_match = [k for k in cities]
city_match_dic = defaultdict(list)
for city in cities:
if city in cities_to_match:
cities_to_match.remove(city)
matches = get_close_matches(city, cities_to_match, 4, 0.85)
for m in matches:
city_match_dic[cities[city]].append(cities[m])
#check lower case/upper case
for city in nodes_by_city:
for match_city in nodes_by_city:
if match_city.lower() == city.lower() and match_city != city:
city_match_dic[city].append(match_city)
if names_matching:
print('\ncity match dictionary:',city_match_dic)
with open('name_match_dictionary.json', 'w', encoding='utf-8') as city_match_dic_file:
city_match_dic_file.write(dumps(city_match_dic, indent=2, ensure_ascii=False))
global links_by_city
links_by_city = defaultdict(list)
for link in links:
links_by_city[link.from_city].append(link)
links_by_city[link.to_city].append(link)
global eqpts_by_city
eqpts_by_city = defaultdict(list)
for eqpt in eqpts:
eqpts_by_city[eqpt.from_city].append(eqpt)
nodes, links = sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city)
data = {
'elements':
[{'uid': f'trx {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Transceiver'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'] +
[{'uid': f'roadm {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm' \
and x.booster_restriction == '' and x.preamp_restriction == ''] +
[{'uid': f'roadm {x.city}',
'params' : {
'restrictions': {
'preamp_variety_list': silent_remove(x.preamp_restriction.split(' | '),''),
'booster_variety_list': silent_remove(x.booster_restriction.split(' | '),'')
}
},
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm' and \
(x.booster_restriction != '' or x.preamp_restriction != '')] +
[{'uid': f'west fused spans in {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Fused'}
for x in nodes_by_city.values() if x.node_type.lower() == 'fused'] +
[{'uid': f'east fused spans in {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Fused'}
for x in nodes_by_city.values() if x.node_type.lower() == 'fused'] +
[{'uid': f'fiber ({x.from_city} \u2192 {x.to_city})-{x.east_cable}',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'type_variety': x.east_fiber,
'params': {'length': round(x.east_distance, 3),
'length_units': x.distance_units,
'loss_coef': x.east_lineic,
'con_in':x.east_con_in,
'con_out':x.east_con_out}
}
for x in links] +
[{'uid': f'fiber ({x.to_city} \u2192 {x.from_city})-{x.west_cable}',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'type_variety': x.west_fiber,
'params': {'length': round(x.west_distance, 3),
'length_units': x.distance_units,
'loss_coef': x.west_lineic,
'con_in':x.west_con_in,
'con_out':x.west_con_out}
} # missing ILA construction
for x in links] +
[{'uid': f'east edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Edfa',
'type_variety': e.east_amp_type,
'operational': {'gain_target': e.east_amp_gain,
'delta_p': e.east_amp_dp,
'tilt_target': e.east_tilt,
'out_voa' : e.east_att_out}
}
for e in eqpts if (e.east_amp_type.lower() != '' and \
e.east_amp_type.lower() != 'fused')] +
[{'uid': f'west edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Edfa',
'type_variety': e.west_amp_type,
'operational': {'gain_target': e.west_amp_gain,
'delta_p': e.west_amp_dp,
'tilt_target': e.west_tilt,
'out_voa' : e.west_att_out}
}
for e in eqpts if (e.west_amp_type.lower() != '' and \
e.west_amp_type.lower() != 'fused')] +
# the fused edfa variety is a hack to indicate that there should be no
# booster amplifier out of the roadm.
# If the user specifies ILA in the Nodes sheet and fused in the Eqpt sheet, assume that
# this is a fused node.
[{'uid': f'east edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Fused',
'params': {'loss': 0}
}
for e in eqpts if e.east_amp_type.lower() == 'fused'] +
[{'uid': f'west edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Fused',
'params': {'loss': 0}
}
for e in eqpts if e.west_amp_type.lower() == 'fused'],
'connections':
list(chain.from_iterable([eqpt_connection_by_city(n.city)
for n in nodes]))
+
list(chain.from_iterable(zip(
[{'from_node': f'trx {x.city}',
'to_node': f'roadm {x.city}'}
for x in nodes_by_city.values() if x.node_type.lower()=='roadm'],
[{'from_node': f'roadm {x.city}',
'to_node': f'trx {x.city}'}
for x in nodes_by_city.values() if x.node_type.lower()=='roadm'])))
}
suffix_filename = str(input_filename.suffixes[0])
full_input_filename = str(input_filename)
split_filename = [full_input_filename[0:len(full_input_filename)-len(suffix_filename)] , suffix_filename[1:]]
output_json_file_name = split_filename[0]+'.json'
with open(output_json_file_name, 'w', encoding='utf-8') as edfa_json_file:
edfa_json_file.write(dumps(data, indent=2, ensure_ascii=False))
return output_json_file_name
def parse_excel(input_filename):
link_headers = \
{ 'Node A': 'from_city',
'Node Z': 'to_city',
'east':{
'Distance (km)': 'east_distance',
'Fiber type': 'east_fiber',
'lineic att': 'east_lineic',
'Con_in': 'east_con_in',
'Con_out': 'east_con_out',
'PMD': 'east_pmd',
'Cable id': 'east_cable'
},
'west':{
'Distance (km)': 'west_distance',
'Fiber type': 'west_fiber',
'lineic att': 'west_lineic',
'Con_in': 'west_con_in',
'Con_out': 'west_con_out',
'PMD': 'west_pmd',
'Cable id': 'west_cable'
}
}
node_headers = \
{ 'City': 'city',
'State': 'state',
'Country': 'country',
'Region': 'region',
'Latitude': 'latitude',
'Longitude': 'longitude',
'Type': 'node_type',
'Booster_restriction': 'booster_restriction',
'Preamp_restriction': 'preamp_restriction'
}
eqpt_headers = \
{ 'Node A': 'from_city',
'Node Z': 'to_city',
'east':{
'amp type': 'east_amp_type',
'att_in': 'east_att_in',
'amp gain': 'east_amp_gain',
'delta p': 'east_amp_dp',
'tilt': 'east_tilt',
'att_out': 'east_att_out'
},
'west':{
'amp type': 'west_amp_type',
'att_in': 'west_att_in',
'amp gain': 'west_amp_gain',
'delta p': 'west_amp_dp',
'tilt': 'west_tilt',
'att_out': 'west_att_out'
}
}
with open_workbook(input_filename) as wb:
nodes_sheet = wb.sheet_by_name('Nodes')
links_sheet = wb.sheet_by_name('Links')
try:
eqpt_sheet = wb.sheet_by_name('Eqpt')
except Exception:
#eqpt_sheet is optional
eqpt_sheet = None
nodes = []
for node in parse_sheet(nodes_sheet, node_headers, NODES_LINE, NODES_LINE+1, NODES_COLUMN):
nodes.append(Node(**node))
expected_node_types = {'ROADM', 'ILA', 'FUSED'}
for n in nodes:
if n.node_type not in expected_node_types:
n.node_type = 'ILA'
links = []
for link in parse_sheet(links_sheet, link_headers, LINKS_LINE, LINKS_LINE+2, LINKS_COLUMN):
links.append(Link(**link))
#print('\n', [l.__dict__ for l in links])
eqpts = []
if eqpt_sheet != None:
for eqpt in parse_sheet(eqpt_sheet, eqpt_headers, EQPTS_LINE, EQPTS_LINE+2, EQPTS_COLUMN):
eqpts.append(Eqpt(**eqpt))
# sanity check
all_cities = Counter(n.city for n in nodes)
if len(all_cities) != len(nodes):
raise ValueError(f'Duplicate city: {all_cities}')
if any(ln.from_city not in all_cities or
ln.to_city not in all_cities for ln in links):
raise ValueError('Bad link: from_city or to_city not found in the Nodes sheet.')
return nodes, links, eqpts
def eqpt_connection_by_city(city_name):
other_cities = fiber_dest_from_source(city_name)
subdata = []
if nodes_by_city[city_name].node_type.lower() in {'ila', 'fused'}:
# Then len(other_cities) == 2
direction = ['west', 'east']
for i in range(2):
from_ = fiber_link(other_cities[i], city_name)
in_ = eqpt_in_city_to_city(city_name, other_cities[0],direction[i])
to_ = fiber_link(city_name, other_cities[1-i])
subdata += connect_eqpt(from_, in_, to_)
elif nodes_by_city[city_name].node_type.lower() == 'roadm':
for other_city in other_cities:
from_ = f'roadm {city_name}'
in_ = eqpt_in_city_to_city(city_name, other_city)
to_ = fiber_link(city_name, other_city)
subdata += connect_eqpt(from_, in_, to_)
from_ = fiber_link(other_city, city_name)
in_ = eqpt_in_city_to_city(city_name, other_city, "west")
to_ = f'roadm {city_name}'
subdata += connect_eqpt(from_, in_, to_)
return subdata
def connect_eqpt(from_, in_, to_):
connections = []
if in_ !='':
connections = [{'from_node': from_, 'to_node': in_},
{'from_node': in_, 'to_node': to_}]
else:
connections = [{'from_node': from_, 'to_node': to_}]
return connections
def eqpt_in_city_to_city(in_city, to_city, direction='east'):
rev_direction = 'west' if direction == 'east' else 'east'
amp_direction = f'{direction}_amp_type'
amp_rev_direction = f'{rev_direction}_amp_type'
return_eqpt = ''
if in_city in eqpts_by_city:
for e in eqpts_by_city[in_city]:
if nodes_by_city[in_city].node_type.lower() == 'roadm':
if e.to_city == to_city and getattr(e, amp_direction) != '':
return_eqpt = f'{direction} edfa in {e.from_city} to {e.to_city}'
elif nodes_by_city[in_city].node_type.lower() == 'ila':
if e.to_city != to_city:
direction = rev_direction
amp_direction = amp_rev_direction
if getattr(e, amp_direction) != '':
return_eqpt = f'{direction} edfa in {e.from_city} to {e.to_city}'
if nodes_by_city[in_city].node_type.lower() == 'fused':
return_eqpt = f'{direction} fused spans in {in_city}'
return return_eqpt
def fiber_dest_from_source(city_name):
destinations = []
links_from_city = links_by_city[city_name]
for l in links_from_city:
if l.from_city == city_name:
destinations.append(l.to_city)
else:
destinations.append(l.from_city)
return destinations
def fiber_link(from_city, to_city):
source_dest = (from_city, to_city)
link = links_by_city[from_city]
l = next(l for l in link if l.from_city in source_dest and l.to_city in source_dest)
if l.from_city == from_city:
fiber = f'fiber ({l.from_city} \u2192 {l.to_city})-{l.east_cable}'
else:
fiber = f'fiber ({l.to_city} \u2192 {l.from_city})-{l.west_cable}'
return fiber
def midpoint(city_a, city_b):
lats = city_a.latitude, city_b.latitude
longs = city_a.longitude, city_b.longitude
try:
result = {
'latitude': sum(lats) / 2,
'longitude': sum(longs) / 2
}
except :
result = {
'latitude': 0,
'longitude': 0
}
return result
#output_json_file_name = 'coronet_conus_example.json'
#TODO get column size automatically from tuple size
NODES_COLUMN = 10
NODES_LINE = 4
LINKS_COLUMN = 16
LINKS_LINE = 3
EQPTS_LINE = 3
EQPTS_COLUMN = 14
parser = ArgumentParser()
parser.add_argument('workbook', nargs='?', type=Path , default='meshTopologyExampleV2.xls')
parser.add_argument('-f', '--filter-region', action='append', default=[])
if __name__ == '__main__':
args = parser.parse_args()
convert_file(args.workbook, filter_region=args.filter_region)
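
For reference, a typical standalone invocation of this converter, as a sketch:

from pathlib import Path

json_name = convert_file(Path('meshTopologyExampleV2.xls'))
# writes meshTopologyExampleV2.json next to the workbook and returns its file name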

File diff suppressed because it is too large Load Diff

View File

@@ -8,261 +8,9 @@ gnpy.core.equipment
This module contains functionality for specifying equipment.
'''
from numpy import clip, polyval
from operator import itemgetter
from math import isclose
from pathlib import Path
from json import load
from gnpy.core.utils import lin2db, db2lin, load_json
from collections import namedtuple
from gnpy.core.elements import Edfa
from gnpy.core.utils import automatic_nch, db2lin
from gnpy.core.exceptions import EquipmentConfigError
import time
Model_vg = namedtuple('Model_vg', 'nf1 nf2 delta_p')
Model_fg = namedtuple('Model_fg', 'nf0')
Model_openroadm = namedtuple('Model_openroadm', 'nf_coef')
Model_hybrid = namedtuple('Model_hybrid', 'nf_ram gain_ram edfa_variety')
Model_dual_stage = namedtuple('Model_dual_stage', 'preamp_variety booster_variety')
class common:
def update_attr(self, default_values, kwargs, name):
clean_kwargs = {k:v for k, v in kwargs.items() if v != ''}
for k, v in default_values.items():
setattr(self, k, clean_kwargs.get(k, v))
if k not in clean_kwargs and name != 'Amp':
print(f'\x1b[1;31;40m'+
f'\n WARNING missing {k} attribute in eqpt_config.json[{name}]'+
f'\n default value is {k} = {v}'+
f'\x1b[0m')
time.sleep(1)
class SI(common):
default_values =\
{
"f_min": 191.35e12,
"f_max": 196.1e12,
"baud_rate": 32e9,
"spacing": 50e9,
"power_dbm": 0,
"power_range_db": [0, 0, 0.5],
"roll_off": 0.15,
"tx_osnr": 45,
"sys_margins": 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'SI')
class Span(common):
default_values = \
{
'power_mode': True,
'delta_power_range_db': None,
'max_fiber_lineic_loss_for_raman': 0.25,
'target_extended_gain': 2.5,
'max_length': 150,
'length_units': 'km',
'max_loss': None,
'padding': 10,
'EOL': 0,
'con_in': 0,
'con_out': 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Span')
class Roadm(common):
default_values = \
{
'target_pch_out_db': -17,
'add_drop_osnr': 100,
'restrictions': {
'preamp_variety_list':[],
'booster_variety_list':[]
}
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Roadm')
class Transceiver(common):
default_values = \
{
'type_variety': None,
'frequency': None,
'mode': {}
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Transceiver')
class Fiber(common):
default_values = \
{
'type_variety': '',
'dispersion': None,
'gamma': 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Fiber')
class RamanFiber(common):
default_values = \
{
'type_variety': '',
'dispersion': None,
'gamma': 0,
'raman_efficiency': None
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'RamanFiber')
for param in ('cr', 'frequency_offset'):
if param not in self.raman_efficiency:
raise EquipmentConfigError(f'RamanFiber.raman_efficiency: missing "{param}" parameter')
if self.raman_efficiency['frequency_offset'] != sorted(self.raman_efficiency['frequency_offset']):
raise EquipmentConfigError(f'RamanFiber.raman_efficiency.frequency_offset is not sorted')
class Amp(common):
default_values = \
{
'f_min': 191.35e12,
'f_max': 196.1e12,
'type_variety': '',
'type_def': '',
'gain_flatmax': None,
'gain_min': None,
'p_max': None,
'nf_model': None,
'dual_stage_model': None,
'nf_fit_coeff': None,
'nf_ripple': None,
'dgt': None,
'gain_ripple': None,
'out_voa_auto': False,
'allowed_for_design': False,
'raman': False
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Amp')
@classmethod
def from_json(cls, filename, **kwargs):
config = Path(filename).parent / 'default_edfa_config.json'
type_variety = kwargs['type_variety']
type_def = kwargs.get('type_def', 'variable_gain') # default compatibility with older json eqpt files
nf_def = None
dual_stage_def = None
if type_def == 'fixed_gain':
try:
nf0 = kwargs.pop('nf0')
except KeyError: #nf0 is expected for a fixed gain amp
raise EquipmentConfigError(f'missing nf0 value input for amplifier: {type_variety} in equipment config')
for k in ('nf_min', 'nf_max'):
try:
del kwargs[k]
except KeyError:
pass
nf_def = Model_fg(nf0)
elif type_def == 'advanced_model':
config = Path(filename).parent / kwargs.pop('advanced_config_from_json')
elif type_def == 'variable_gain':
gain_min, gain_max = kwargs['gain_min'], kwargs['gain_flatmax']
try: #nf_min and nf_max are expected for a variable gain amp
nf_min = kwargs.pop('nf_min')
nf_max = kwargs.pop('nf_max')
except KeyError:
raise EquipmentConfigError(f'missing nf_min or nf_max value input for amplifier: {type_variety} in equipment config')
try: #remove all remaining nf inputs
del kwargs['nf0']
except KeyError: pass #nf0 is not needed for variable gain amp
nf1, nf2, delta_p = nf_model(type_variety, gain_min, gain_max, nf_min, nf_max)
nf_def = Model_vg(nf1, nf2, delta_p)
elif type_def == 'openroadm':
try:
nf_coef = kwargs.pop('nf_coef')
except KeyError: #nf_coef is expected for openroadm amp
raise EquipmentConfigError(f'missing nf_coef input for amplifier: {type_variety} in equipment config')
nf_def = Model_openroadm(nf_coef)
elif type_def == 'dual_stage':
try: #nf_ram and gain_ram are expected for a hybrid amp
preamp_variety = kwargs.pop('preamp_variety')
booster_variety = kwargs.pop('booster_variety')
except KeyError:
raise EquipmentConfigError(f'missing preamp/booster variety input for amplifier: {type_variety} in equipment config')
dual_stage_def = Model_dual_stage(preamp_variety, booster_variety)
with open(config, encoding='utf-8') as f:
json_data = load(f)
return cls(**{**kwargs, **json_data,
'nf_model': nf_def, 'dual_stage_model': dual_stage_def})
def nf_model(type_variety, gain_min, gain_max, nf_min, nf_max):
if nf_min < -10:
raise EquipmentConfigError(f'Invalid nf_min value {nf_min!r} for amplifier {type_variety}')
if nf_max < -10:
raise EquipmentConfigError(f'Invalid nf_max value {nf_max!r} for amplifier {type_variety}')
# NF estimation model based on nf_min and nf_max
# delta_p: max power dB difference between first and second stage coils
# dB g1a: first stage gain - internal VOA attenuation
# nf1, nf2: first and second stage coils
# calculated by solving nf_{min,max} = nf1 + nf2 / g1a{min,max}
delta_p = 5
g1a_min = gain_min - (gain_max - gain_min) - delta_p
g1a_max = gain_max - delta_p
nf2 = lin2db((db2lin(nf_min) - db2lin(nf_max)) /
(1/db2lin(g1a_max) - 1/db2lin(g1a_min)))
nf1 = lin2db(db2lin(nf_min) - db2lin(nf2)/db2lin(g1a_max))
if nf1 < 4:
raise EquipmentConfigError(f'First coil value too low {nf1} for amplifier {type_variety}')
# Check that delta_p stays within a sane range (1 dB < delta_p < 11 dB below)
# to ensure the nf_min and nf_max values make sense.
# There shouldn't be a high NF difference between the two coils:
# nf2 should satisfy nf1 + 0.3 < nf2 < nf1 + 2
# If not, recompute and re-check delta_p
if not nf1 + 0.3 < nf2 < nf1 + 2:
nf2 = clip(nf2, nf1 + 0.3, nf1 + 2)
g1a_max = lin2db(db2lin(nf2) / (db2lin(nf_min) - db2lin(nf1)))
delta_p = gain_max - g1a_max
g1a_min = gain_min - (gain_max-gain_min) - delta_p
if not 1 < delta_p < 11:
raise EquipmentConfigError(f'Computed \N{greek capital letter delta}P invalid \
\n 1st coil vs 2nd coil calculated DeltaP {delta_p:.2f} for \
\n amplifier {type_variety} is not valid: revise inputs \
\n calculated 1st coil NF = {nf1:.2f}, 2nd coil NF = {nf2:.2f}')
# Check calculated values for nf1 and nf2
calc_nf_min = lin2db(db2lin(nf1) + db2lin(nf2)/db2lin(g1a_max))
if not isclose(nf_min, calc_nf_min, abs_tol=0.01):
raise EquipmentConfigError(f'nf_min does not match calc_nf_min, {nf_min} vs {calc_nf_min} for amp {type_variety}')
calc_nf_max = lin2db(db2lin(nf1) + db2lin(nf2)/db2lin(g1a_min))
if not isclose(nf_max, calc_nf_max, abs_tol=0.01):
raise EquipmentConfigError(f'nf_max does not match calc_nf_max, {nf_max} vs {calc_nf_max} for amp {type_variety}')
return nf1, nf2, delta_p
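# Worked check (sketch, illustrative numbers): nf_model('std_edfa', gain_min=15,
# gain_max=26, nf_min=6, nf_max=10) keeps delta_p = 5, giving g1a_min = -1 dB and
# g1a_max = 21 dB, hence nf2 ~ 6.82 dB and nf1 ~ 5.96 dB; the isclose() checks then
# confirm that nf1/nf2 reproduce nf_min and nf_max within 0.01 dB.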
def edfa_nf(gain_target, variety_type, equipment):
amp_params = equipment['Edfa'][variety_type]
amp = Edfa(
uid = f'calc_NF',
params = amp_params.__dict__,
operational = {
'gain_target': gain_target,
'tilt_target': 0
}
)
amp.pin_db = 0
amp.nch = 88
return amp._calc_nf(True)
def trx_mode_params(equipment, trx_type_variety='', trx_mode='', error_message=False):
"""return the trx and SI parameters from eqpt_config for a given type_variety and mode (ie format)"""
@@ -271,38 +19,38 @@ def trx_mode_params(equipment, trx_type_variety='', trx_mode='', error_message=F
try:
trxs = equipment['Transceiver']
#if called from path_requests_run.py, trx_mode is filled with None when not specified by user
#if called from transmission_main.py, trx_mode is ''
# if called from path_requests_run.py, trx_mode is filled with None when not specified by user
# if called from transmission_main.py, trx_mode is ''
if trx_mode is not None:
mode_params = next(mode for trx in trxs \
if trx == trx_type_variety \
for mode in trxs[trx].mode \
if mode['format'] == trx_mode)
mode_params = next(mode for trx in trxs
if trx == trx_type_variety
for mode in trxs[trx].mode
if mode['format'] == trx_mode)
trx_params = {**mode_params}
# sanity check: the baud rate must be smaller than the min spacing
if trx_params['baud_rate'] > trx_params['min_spacing'] :
raise EquipmentConfigError(f'Inconsistency in equipment library:\n Transponder "{trx_type_variety}" mode "{trx_params["format"]}" '+\
f'has baud rate: {trx_params["baud_rate"]*1e-9} GHz greater than min_spacing {trx_params["min_spacing"]*1e-9}.')
if trx_params['baud_rate'] > trx_params['min_spacing']:
raise EquipmentConfigError(f'Inconsistency in equipment library:\n Transponder "{trx_type_variety}" mode "{trx_params["format"]}" ' +
f'has baud rate {trx_params["baud_rate"]*1e-9} GHz greater than min_spacing {trx_params["min_spacing"]*1e-9}.')
else:
mode_params = {"format": "undetermined",
"baud_rate": None,
"OSNR": None,
"bit_rate": None,
"roll_off": None,
"tx_osnr":None,
"min_spacing":None,
"cost":None}
"tx_osnr": None,
"min_spacing": None,
"cost": None}
trx_params = {**mode_params}
trx_params['f_min'] = equipment['Transceiver'][trx_type_variety].frequency['min']
trx_params['f_max'] = equipment['Transceiver'][trx_type_variety].frequency['max']
# TODO: novel automatic feature maybe unwanted if spacing is specified
# trx_params['spacing'] = automatic_spacing(trx_params['baud_rate'])
# trx_params['spacing'] = _automatic_spacing(trx_params['baud_rate'])
# temp = trx_params['spacing']
# print(f'spacing {temp}')
except StopIteration :
except StopIteration:
if error_message:
raise EquipmentConfigError(f'Computation stopped: could not find tsp: {trx_type_variety} with mode: {trx_mode} in eqpt library')
raise EquipmentConfigError(f'Could not find transponder "{trx_type_variety}" with mode "{trx_mode}" in equipment library')
else:
# default transponder characteristics
# mainly used with transmission_main_example.py
@@ -320,82 +68,6 @@ def trx_mode_params(equipment, trx_type_variety='', trx_mode='', error_message=F
trx_params['nb_channel'] = nch
print(f'There are {nch} channels propagating')
trx_params['power'] = db2lin(default_si_data.power_dbm)*1e-3
trx_params['power'] = db2lin(default_si_data.power_dbm) * 1e-3
return trx_params
def automatic_spacing(baud_rate):
"""return the min possible channel spacing for a given baud rate"""
# TODO: this should be parametrized in a cfg file
# list of possible tuples [(max_baud_rate, spacing_for_this_baud_rate)]
spacing_list = [(33e9, 37.5e9), (38e9, 50e9), (50e9, 62.5e9), (67e9, 75e9), (92e9, 100e9)]
return min((s[1] for s in spacing_list if s[0] > baud_rate), default=baud_rate*1.2)
def automatic_nch(f_min, f_max, spacing):
return int((f_max - f_min)//spacing)
def automatic_fmax(f_min, spacing, nch):
return f_min + spacing * nch
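# Examples (sketch): automatic_spacing(32e9) -> 37.5e9 (smallest spacing whose max
# baud rate exceeds 32 GBaud); automatic_nch(191.35e12, 196.1e12, 50e9) -> 95;
# automatic_fmax(191.35e12, 50e9, 96) -> 196.15e12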
def load_equipment(filename):
json_data = load_json(filename)
return equipment_from_json(json_data, filename)
def update_trx_osnr(equipment):
"""add sys_margins to all Transceivers OSNR values"""
for trx in equipment['Transceiver'].values():
for m in trx.mode:
m['OSNR'] = m['OSNR'] + equipment['SI']['default'].sys_margins
return equipment
def update_dual_stage(equipment):
edfa_dict = equipment['Edfa']
for edfa in edfa_dict.values():
if edfa.type_def == 'dual_stage':
edfa_preamp = edfa_dict[edfa.dual_stage_model.preamp_variety]
edfa_booster = edfa_dict[edfa.dual_stage_model.booster_variety]
for key, value in edfa_preamp.__dict__.items():
attr_k = 'preamp_' + key
setattr(edfa, attr_k, value)
for key, value in edfa_booster.__dict__.items():
attr_k = 'booster_' + key
setattr(edfa, attr_k, value)
edfa.p_max = edfa_booster.p_max
edfa.gain_flatmax = edfa_booster.gain_flatmax + edfa_preamp.gain_flatmax
if edfa.gain_min < edfa_preamp.gain_min:
raise EquipmentConfigError(f'Dual stage {edfa.type_variety} min gain is lower than its preamp min gain')
return equipment
def roadm_restrictions_sanity_check(equipment):
""" verifies that booster and preamp restrictions specified in roadm equipment are listed
in the edfa equipment library.
"""
restrictions = equipment['Roadm']['default'].restrictions['booster_variety_list'] + \
equipment['Roadm']['default'].restrictions['preamp_variety_list']
for amp_name in restrictions:
if amp_name not in equipment['Edfa']:
raise EquipmentConfigError(f'ROADM restriction {amp_name} does not refer to a defined EDFA name')
def equipment_from_json(json_data, filename):
"""build global dictionnary eqpt_library that stores all eqpt characteristics:
edfa type type_variety, fiber type_variety
from the eqpt_config.json (filename parameter)
also read advanced_config_from_json file parameters for edfa if they are available:
typically nf_ripple, dfg gain ripple, dgt and nf polynomial nf_fit_coeff
if advanced_config_from_json file parameter is not present: use nf_model:
requires nf_min and nf_max values boundaries of the edfa gain range
"""
equipment = {}
for key, entries in json_data.items():
equipment[key] = {}
typ = globals()[key]
for entry in entries:
subkey = entry.get('type_variety', 'default')
if key == 'Edfa':
equipment[key][subkey] = Amp.from_json(filename, **entry)
else:
equipment[key][subkey] = typ(**entry)
equipment = update_trx_osnr(equipment)
equipment = update_dual_stage(equipment)
roadm_restrictions_sanity_check(equipment)
return equipment

View File

@@ -12,8 +12,26 @@ Exceptions thrown by other gnpy modules
class ConfigurationError(Exception):
'''User-provided configuration contains an error'''
class EquipmentConfigError(ConfigurationError):
'''Incomplete or wrong configuration within the equipment library'''
class NetworkTopologyError(ConfigurationError):
'''Topology of user-provided network is wrong'''
class ServiceError(Exception):
'''Service of user-provided request is wrong'''
class DisjunctionError(ServiceError):
'''Disjunction of user-provided request cannot be satisfied'''
class SpectrumError(Exception):
'''Spectrum errors of the program'''
class ParametersError(ConfigurationError):
'''Incomplete or wrong configurations within parameters json'''
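
Because the configuration-related classes share ConfigurationError as a base, callers can catch the whole family at once; a small sketch ('eqpt_config.json' stands in for any equipment library file):

from gnpy.core.equipment import load_equipment
from gnpy.core.exceptions import ConfigurationError

try:
    equipment = load_equipment('eqpt_config.json')
except ConfigurationError as error:
    # EquipmentConfigError, NetworkTopologyError and ParametersError all land here
    print(f'Configuration problem: {error}')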

View File

@@ -1,10 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.core.execute
=================
This module contains functions for executing the propagation of
spectral information on a `gnpy` network.
'''

View File

@@ -10,22 +10,28 @@ This module contains classes for modelling :class:`SpectralInformation`.
from collections import namedtuple
from numpy import array
from gnpy.core.utils import lin2db, db2lin
from json import loads
from gnpy.core.utils import load_json
from gnpy.core.equipment import automatic_nch, automatic_spacing
from gnpy.core.utils import automatic_nch, lin2db
class Power(namedtuple('Power', 'signal nli ase')):
"""carriers power in W"""
class Channel(namedtuple('Channel', 'channel_number frequency baud_rate roll_off power')):
pass
class Channel(namedtuple('Channel', 'channel_number frequency baud_rate roll_off power chromatic_dispersion pmd')):
""" Class containing the parameters of a WDM signal.
:param channel_number: channel number in the WDM grid
:param frequency: central frequency of the signal (Hz)
:param baud_rate: the symbol rate of the signal (Baud)
:param roll_off: the roll off of the signal. It is a pure number between 0 and 1
:param power (gnpy.core.info.Power): power of signal, ASE noise and NLI (W)
:param chromatic_dispersion: chromatic dispersion (s/m)
:param pmd: polarization mode dispersion (s)
"""
class Pref(namedtuple('Pref', 'p_span0, p_spani, neq_ch ')):
"""noiseless reference power in dBm:
"""noiseless reference power in dBm:
p_span0: initial target carrier power
p_spani: carrier power after element i
neq_ch: equivalent channel count in dB"""
@@ -44,29 +50,8 @@ def create_input_spectral_information(f_min, f_max, roll_off, baud_rate, power,
si = SpectralInformation(
pref=Pref(pref, pref, lin2db(nb_channel)),
carriers=[
Channel(f, (f_min+spacing*f),
baud_rate, roll_off, Power(power, 0, 0)) for f in range(1,nb_channel+1)
])
return si
if __name__ == '__main__':
pref = lin2db(power * 1e3)
si = SpectralInformation(
Pref(pref, pref),
Channel(1, 193.95e12, 32e9, 0.15, # 193.95 THz, 32 Gbaud
Power(1e-3, 1e-6, 1e-6)), # 1 mW, 1uW, 1uW
Channel(1, 195.95e12, 32e9, 0.15, # 195.95 THz, 32 Gbaud
Power(1.2e-3, 1e-6, 1e-6)), # 1.2 mW, 1uW, 1uW
Channel(f, (f_min + spacing * f),
baud_rate, roll_off, Power(power, 0, 0), 0, 0) for f in range(1, nb_channel + 1)
]
)
si = SpectralInformation()
spacing = 0.05 # THz
si = si._replace(carriers=tuple(Channel(f+1, 191.3+spacing*(f+1), 32e9, 0.15, Power(1e-3, f, 1)) for f in range(96)))
print(f'si = {si}')
print(f'si = {si.carriers[0].power.nli}')
print(f'si = {si.carriers[20].power.nli}')
si2 = si._replace(carriers=tuple(c._replace(power = c.power._replace(nli = c.power.nli * 1e5))
for c in si.carriers))
print(f'si2 = {si2}')
return si

View File

@@ -5,92 +5,31 @@
gnpy.core.network
=================
This module contains functions for constructing networks of network elements.
Working with networks which consist of network elements
'''
from gnpy.core.convert import convert_file
from networkx import DiGraph
from numpy import arange
from scipy.interpolate import interp1d
from logging import getLogger
from os import path
from operator import itemgetter, attrgetter
from gnpy.core import elements
from gnpy.core.elements import Fiber, Edfa, Transceiver, Roadm, Fused, RamanFiber
from gnpy.core.equipment import edfa_nf
from operator import attrgetter
from gnpy.core import ansi_escapes, elements
from gnpy.core.exceptions import ConfigurationError, NetworkTopologyError
from gnpy.core.units import UNITS
from gnpy.core.utils import (load_json, save_json, round2float, db2lin,
merge_amplifier_restrictions)
from gnpy.core.science_utils import SimParams
from gnpy.core.utils import round2float, convert_length
from collections import namedtuple
logger = getLogger(__name__)
def load_network(filename, equipment, name_matching = False):
json_filename = ''
if filename.suffix.lower() == '.xls':
logger.info('Automatically generating topology JSON file')
json_filename = convert_file(filename, name_matching)
elif filename.suffix.lower() == '.json':
json_filename = filename
else:
raise ValueError(f'unsupported topology filename extension {filename.suffix.lower()}')
json_data = load_json(json_filename)
return network_from_json(json_data, equipment)
def save_network(filename, network):
filename_output = path.splitext(filename)[0] + '_auto_design.json'
json_data = network_to_json(network)
save_json(json_data, filename_output)
def network_from_json(json_data, equipment):
# NOTE|dutc: we could use the following, but it would tie our data format
# too closely to the graph library
# from networkx import node_link_graph
g = DiGraph()
for el_config in json_data['elements']:
typ = el_config.pop('type')
variety = el_config.pop('type_variety', 'default')
if typ in equipment and variety in equipment[typ]:
extra_params = equipment[typ][variety]
temp = el_config.setdefault('params', {})
temp = merge_amplifier_restrictions(temp, extra_params.__dict__)
el_config['params'] = temp
elif typ in ['Edfa', 'Fiber']: # catch it now because the code will crash later!
raise ConfigurationError(f'The {typ} of variety type {variety} was not recognized:'
'\nplease check it is properly defined in the eqpt_config json file')
cls = getattr(elements, typ)
el = cls(**el_config)
g.add_node(el)
nodes = {k.uid: k for k in g.nodes()}
for cx in json_data['connections']:
from_node, to_node = cx['from_node'], cx['to_node']
try:
if isinstance(nodes[from_node], Fiber):
edge_length = nodes[from_node].params.length
else:
edge_length = 0.01
g.add_edge(nodes[from_node], nodes[to_node], weight = edge_length)
except KeyError:
raise NetworkTopologyError(f'can not find {from_node} or {to_node} defined in {cx}')
return g
def network_to_json(network):
data = {
'elements': [n.to_json for n in network]
def edfa_nf(gain_target, variety_type, equipment):
amp_params = equipment['Edfa'][variety_type]
amp = elements.Edfa(
uid='calc_NF',
params=amp_params.__dict__,
operational={
'gain_target': gain_target,
'tilt_target': 0
}
connections = {
'connections': [{"from_node": n.uid,
"to_node": next_n.uid}
for n in network
for next_n in network.successors(n) if next_n is not None]
}
data.update(connections)
return data
)
amp.pin_db = 0
amp.nch = 88
return amp._calc_nf(True)
def select_edfa(raman_allowed, gain_target, power_target, equipment, uid, restrictions=None):
"""amplifer selection algorithm
@@ -103,7 +42,7 @@ def select_edfa(raman_allowed, gain_target, power_target, equipment, uid, restri
# because the main use case is to have specific roadm amps which are not allowed for ILA
# with the auto design
edfa_dict = {name: amp for (name, amp) in equipment['Edfa'].items()
if restrictions is None or name in restrictions}
if restrictions is None or name in restrictions}
pin = power_target - gain_target
@@ -114,51 +53,49 @@ def select_edfa(raman_allowed, gain_target, power_target, equipment, uid, restri
# extended gain max allowance TARGET_EXTENDED_GAIN is coming from eqpt_config.json
# the power attribute includes power AND gain limitations
edfa_list = [Edfa_list(
variety=edfa_variety,
power=min(
pin
+edfa.gain_flatmax
+TARGET_EXTENDED_GAIN,
edfa.p_max
)
-power_target,
gain_min=
gain_target+3
-edfa.gain_min,
nf=edfa_nf(gain_target, edfa_variety, equipment)) \
for edfa_variety, edfa in edfa_dict.items()
if ((edfa.allowed_for_design or restrictions is not None) and not edfa.raman)]
variety=edfa_variety,
power=min(
pin
+ edfa.gain_flatmax
+ TARGET_EXTENDED_GAIN,
edfa.p_max
)
- power_target,
gain_min=gain_target + 3
- edfa.gain_min,
nf=edfa_nf(gain_target, edfa_variety, equipment))
for edfa_variety, edfa in edfa_dict.items()
if ((edfa.allowed_for_design or restrictions is not None) and not edfa.raman)]
#consider a Raman list because of different gain_min requirement:
#do not allow extended gain min for Raman
# consider a Raman list because of different gain_min requirement:
# do not allow extended gain min for Raman
raman_list = [Edfa_list(
variety=edfa_variety,
power=min(
pin
+edfa.gain_flatmax
+TARGET_EXTENDED_GAIN,
edfa.p_max
)
-power_target,
gain_min=
gain_target
-edfa.gain_min,
nf=edfa_nf(gain_target, edfa_variety, equipment))
for edfa_variety, edfa in edfa_dict.items()
if (edfa.allowed_for_design and edfa.raman)] \
if raman_allowed else []
variety=edfa_variety,
power=min(
pin
+ edfa.gain_flatmax
+ TARGET_EXTENDED_GAIN,
edfa.p_max
)
- power_target,
gain_min=gain_target
- edfa.gain_min,
nf=edfa_nf(gain_target, edfa_variety, equipment))
for edfa_variety, edfa in edfa_dict.items()
if (edfa.allowed_for_design and edfa.raman)] \
if raman_allowed else []
#merge raman and edfa lists
# merge raman and edfa lists
amp_list = edfa_list + raman_list
#filter on min gain limitation:
acceptable_gain_min_list = [x for x in amp_list if x.gain_min>0]
# filter on min gain limitation:
acceptable_gain_min_list = [x for x in amp_list if x.gain_min > 0]
if len(acceptable_gain_min_list) < 1:
#do not take this empty list into account for the rest of the code
#but issue a warning to the user and do not consider Raman
#Raman below min gain should not be allowed because it is meant to be a design requirement
#and raman padding at the amplifier input is impossible!
# do not take this empty list into account for the rest of the code
# but issue a warning to the user and do not consider Raman
# Raman below min gain should not be allowed because it is meant to be a design requirement
# and raman padding at the amplifier input is impossible!
if len(edfa_list) < 1:
raise ConfigurationError(f'auto_design could not find any amplifier \
@@ -167,48 +104,42 @@ def select_edfa(raman_allowed, gain_target, power_target, equipment, uid, restri
else:
# TODO: convert to logging
print(
f'\x1b[1;31;40m'\
+ f'WARNING: target gain in node {uid} is below all available amplifiers min gain: \
amplifier input padding will be assumed, consider increasing span fiber padding instead'\
+ '\x1b[0m'
)
f'{ansi_escapes.red}WARNING:{ansi_escapes.reset} target gain in node {uid} is below all available amplifiers min gain: \
amplifier input padding will be assumed, consider increasing span fiber padding instead'
)
acceptable_gain_min_list = edfa_list
#filter on gain+power limitation:
#this list checks both the gain and the power requirement
#because of the way .power is calculated in the list
acceptable_power_list = [x for x in acceptable_gain_min_list if x.power>0]
# filter on gain+power limitation:
# this list checks both the gain and the power requirement
# because of the way .power is calculated in the list
acceptable_power_list = [x for x in acceptable_gain_min_list if x.power > 0]
if len(acceptable_power_list) < 1:
#no amplifier satisfies the required power, so pick the highest power(s):
# no amplifier satisfies the required power, so pick the highest power(s):
power_max = max(acceptable_gain_min_list, key=attrgetter('power')).power
#check and pick if other amplifiers may have a similar gain/power
#allow a 0.3dB power range
#this allows choosing an amplifier with a better NF subsequently
# check and pick if other amplifiers may have a similar gain/power
# allow a 0.3dB power range
# this allows choosing an amplifier with a better NF subsequently
acceptable_power_list = [x for x in acceptable_gain_min_list
if x.power-power_max>-0.3]
if x.power - power_max > -0.3]
# gain and power requirements are resolved,
# => choose the amp with the best NF among the acceptable ones:
selected_edfa = min(acceptable_power_list, key=attrgetter('nf')) #filter on NF
#check what are the gain and power limitations of this amp
power_reduction = round(min(selected_edfa.power, 0),2)
selected_edfa = min(acceptable_power_list, key=attrgetter('nf')) # filter on NF
# check what are the gain and power limitations of this amp
power_reduction = round(min(selected_edfa.power, 0), 2)
if power_reduction < -0.5:
print(
f'\x1b[1;31;40m'\
+ f'WARNING: target gain and power in node {uid}\n \
f'{ansi_escapes.red}WARNING:{ansi_escapes.reset} target gain and power in node {uid}\n \
is beyond all available amplifiers capabilities and/or extended_gain_range:\n\
a power reduction of {power_reduction} is applied\n'\
+ '\x1b[0m'
)
a power reduction of {power_reduction} is applied\n'
)
return selected_edfa.variety, power_reduction
def target_power(network, node, equipment): #get_fiber_dp
def target_power(network, node, equipment): # get_fiber_dp
SPAN_LOSS_REF = 20
POWER_SLOPE = 0.3
power_mode = equipment['Span']['default'].power_mode
dp_range = list(equipment['Span']['default'].delta_power_range_db)
node_loss = span_loss(network, node)
@@ -218,13 +149,14 @@ def target_power(network, node, equipment): #get_fiber_dp
dp = min(dp_range[1], dp)
except KeyError:
raise ConfigurationError(f'invalid delta_power_range_db definition in eqpt_config[Span]'
f'delta_power_range_db: [lower_bound, upper_bound, step]')
f'delta_power_range_db: [lower_bound, upper_bound, step]')
if isinstance(node, Roadm):
if isinstance(node, elements.Roadm):
dp = 0
return dp
def prev_node_generator(network, node):
"""fused spans interest:
iterate over all predecessors while they are Fused or Fiber type"""
@@ -233,12 +165,13 @@ def prev_node_generator(network, node):
except StopIteration:
raise NetworkTopologyError(f'Node {node.uid} is not properly connected, please check network topology')
# yield and re-iterate
if isinstance(prev_node, Fused) or isinstance(node, Fused):
if isinstance(prev_node, elements.Fused) or isinstance(node, elements.Fused):
yield prev_node
yield from prev_node_generator(network, prev_node)
else:
return
def next_node_generator(network, node):
"""fused spans interest:
iterate over all successors while they are Fused or Fiber type"""
@@ -247,30 +180,32 @@ def next_node_generator(network, node):
except StopIteration:
raise NetworkTopologyError(f'Node {node.uid} is not properly connected, please check network topology')
# yield and re-iterate
if isinstance(next_node, Fused) or isinstance(node, Fused):
if isinstance(next_node, elements.Fused) or isinstance(node, elements.Fused):
yield next_node
yield from next_node_generator(network, next_node)
else:
return
def span_loss(network, node):
"""Fused span interest:
return the total span loss of all the fibers spliced by a Fused node"""
loss = node.loss if node.passive else 0
try:
prev_node = next(n for n in network.predecessors(node))
if isinstance(prev_node, Fused):
if isinstance(prev_node, elements.Fused):
loss += sum(n.loss for n in prev_node_generator(network, node))
except StopIteration:
pass
try:
next_node = next(n for n in network.successors(node))
if isinstance(next_node, Fused):
if isinstance(next_node, elements.Fused):
loss += sum(n.loss for n in next_node_generator(network, node))
except StopIteration:
pass
return loss
def find_first_node(network, node):
"""Fused node interest:
returns the 1st node at the origin of a succession of fused nodes
@@ -280,6 +215,7 @@ def find_first_node(network, node):
pass
return this_node
def find_last_node(network, node):
"""Fused node interest:
returns the last node in a succession of fused nodes
@@ -289,29 +225,30 @@ def find_last_node(network, node):
pass
return this_node
def set_amplifier_voa(amp, power_target, power_mode):
VOA_MARGIN = 1 #do not maximize the VOA optimization
VOA_MARGIN = 1 # do not maximize the VOA optimization
if amp.out_voa is None:
if power_mode:
gain_target = amp.effective_gain
voa = min(amp.params.p_max-power_target,
amp.params.gain_flatmax-amp.effective_gain)
voa = min(amp.params.p_max - power_target,
amp.params.gain_flatmax - amp.effective_gain)
voa = max(round2float(max(voa, 0), 0.5) - VOA_MARGIN, 0) if amp.params.out_voa_auto else 0
amp.delta_p = amp.delta_p + voa
amp.effective_gain = amp.effective_gain + voa
else:
voa = 0 # no output voa optimization in gain mode
voa = 0 # no output voa optimization in gain mode
amp.out_voa = voa
def set_egress_amplifier(network, roadm, equipment, pref_total_db):
power_mode = equipment['Span']['default'].power_mode
next_oms = (n for n in network.successors(roadm) if not isinstance(n, Transceiver))
next_oms = (n for n in network.successors(roadm) if not isinstance(n, elements.Transceiver))
for oms in next_oms:
#go through all the OMS departing from the Roadm
# go through all the OMS departing from the Roadm
node = roadm
prev_node = roadm
next_node = oms
# if isinstance(next_node, Fused): #support ROADM wo egress amp for metro applications
# if isinstance(next_node, elements.Fused): #support ROADM wo egress amp for metro applications
# node = find_last_node(next_node)
# next_node = next(n for n in network.successors(node))
# next_node = find_last_node(next_node)
@@ -320,8 +257,8 @@ def set_egress_amplifier(network, roadm, equipment, pref_total_db):
prev_voa = 0
voa = 0
while True:
#go through all nodes in the OMS (loop until next Roadm instance)
if isinstance(node, Edfa):
# go through all nodes in the OMS (loop until next Roadm instance)
if isinstance(node, elements.Edfa):
node_loss = span_loss(network, prev_node)
voa = node.out_voa if node.out_voa else 0
if node.delta_p is None:
@@ -331,25 +268,25 @@ def set_egress_amplifier(network, roadm, equipment, pref_total_db):
gain_from_dp = node_loss + dp - prev_dp + prev_voa
if node.effective_gain is None or power_mode:
gain_target = gain_from_dp
else: #gain mode with effective_gain
else: # gain mode with effective_gain
gain_target = node.effective_gain
dp = prev_dp - node_loss + gain_target
power_target = pref_total_db + dp
power_target = pref_total_db + dp
raman_allowed = False
if isinstance(prev_node, Fiber):
if isinstance(prev_node, elements.Fiber):
max_fiber_lineic_loss_for_raman = \
equipment['Span']['default'].max_fiber_lineic_loss_for_raman
equipment['Span']['default'].max_fiber_lineic_loss_for_raman
raman_allowed = prev_node.params.loss_coef < max_fiber_lineic_loss_for_raman
# implementation of restrictions on roadm boosters
if isinstance(prev_node,Roadm):
if isinstance(prev_node, elements.Roadm):
if prev_node.restrictions['booster_variety_list']:
restrictions = prev_node.restrictions['booster_variety_list']
else:
restrictions = None
elif isinstance(next_node,Roadm):
elif isinstance(next_node, elements.Roadm):
# implementation of restrictions on roadm preamp
if next_node.restrictions['preamp_variety_list']:
restrictions = next_node.restrictions['preamp_variety_list']
@@ -358,25 +295,19 @@ def set_egress_amplifier(network, roadm, equipment, pref_total_db):
else:
restrictions = None
if node.params.type_variety == '':
edfa_variety, power_reduction = select_edfa(raman_allowed,
gain_target, power_target, equipment, node.uid, restrictions)
if node.params.type_variety == '':
edfa_variety, power_reduction = select_edfa(raman_allowed, gain_target, power_target, equipment, node.uid, restrictions)
extra_params = equipment['Edfa'][edfa_variety]
node.params.update_params(extra_params.__dict__)
dp += power_reduction
gain_target += power_reduction
elif node.params.raman and not raman_allowed:
print(
f'\x1b[1;31;40m'\
+ f'WARNING: raman is used in node {node.uid}\n \
but fiber lineic loss is above threshold\n'\
+ '\x1b[0m'
)
print(f'{ansi_escapes.red}WARNING{ansi_escapes.reset}: raman is used in node {node.uid}\n but fiber lineic loss is above threshold\n')
node.delta_p = dp if power_mode else None
node.effective_gain = gain_target
set_amplifier_voa(node, power_target, power_mode)
if isinstance(next_node, elements.Roadm) or isinstance(next_node, elements.Transceiver):
break
prev_dp = dp
prev_voa = voa
@@ -388,32 +319,32 @@ def set_egress_amplifier(network, roadm, equipment, pref_total_db):
def add_egress_amplifier(network, node):
next_nodes = [n for n in network.successors(node)
if not (isinstance(n, elements.Transceiver) or isinstance(n, elements.Fused) or isinstance(n, elements.Edfa))]
# no amplification for fused spans or TRX
for i, next_node in enumerate(next_nodes):
network.remove_edge(node, next_node)
amp = elements.Edfa(
uid=f'Edfa{i}_{node.uid}',
params={},
metadata={
'location': {
'latitude': (node.lat * 2 + next_node.lat * 2) / 4,
'longitude': (node.lng * 2 + next_node.lng * 2) / 4,
'city': node.loc.city,
'region': node.loc.region,
}
},
operational={
'gain_target': None,
'tilt_target': 0,
})
network.add_node(amp)
if isinstance(node, elements.Fiber):
edgeweight = node.params.length
else:
edgeweight = 0.01
network.add_edge(node, amp, weight=edgeweight)
network.add_edge(amp, next_node, weight=0.01)
def calculate_new_length(fiber_length, bounds, target_length):
@@ -422,17 +353,17 @@ def calculate_new_length(fiber_length, bounds, target_length):
n_spans = int(fiber_length // target_length)
length1 = fiber_length / (n_spans + 1)
delta1 = target_length - length1
result1 = (length1, n_spans + 1)
length2 = fiber_length / n_spans
delta2 = length2 - target_length
result2 = (length2, n_spans)
if (bounds.start <= length1 <= bounds.stop) and not(bounds.start <= length2 <= bounds.stop):
result = result1
elif (bounds.start <= length2 <= bounds.stop) and not(bounds.start <= length1 <= bounds.stop):
result = result2
else:
result = result1 if delta1 < delta2 else result2
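# Worked example (values assumed for illustration): splitting a 200 km fiber with
# bounds = range(50_000, 150_000) and target_length = 90_000 m:
#   n_spans = int(200_000 // 90_000) = 2
#   length1 = 200_000 / 3 ~ 66_667 m, delta1 ~ 23_333 m -> result1 = (66_667, 3)
#   length2 = 200_000 / 2 = 100_000 m, delta2 = 10_000 m -> result2 = (100_000, 2)
# Both candidate lengths fall within bounds, so the smaller delta wins:
# two spans of 100 km each.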
@@ -441,7 +372,7 @@ def calculate_new_length(fiber_length, bounds, target_length):
def split_fiber(network, fiber, bounds, target_length, equipment):
new_length, n_spans = calculate_new_length(fiber.params.length, bounds, target_length)
if n_spans == 1:
return
@@ -453,80 +384,83 @@ def split_fiber(network, fiber, bounds, target_length, equipment):
network.remove_node(fiber)
fiber.params.length = new_length
f = interp1d([prev_node.lng, next_node.lng], [prev_node.lat, next_node.lat])
xpos = [prev_node.lng + (next_node.lng - prev_node.lng) * (n + 1) / (n_spans + 1) for n in range(n_spans)]
ypos = f(xpos)
for span, lng, lat in zip(range(n_spans), xpos, ypos):
new_span = elements.Fiber(uid=f'{fiber.uid}_({span+1}/{n_spans})',
type_variety=fiber.type_variety,
metadata={
'location': {
'latitude': lat,
'longitude': lng,
'city': fiber.loc.city,
'region': fiber.loc.region,
}
},
params=fiber.params.asdict())
if isinstance(prev_node, elements.Fiber):
edgeweight = prev_node.params.length
else:
edgeweight = 0.01
network.add_edge(prev_node, new_span, weight=edgeweight)
prev_node = new_span
if isinstance(prev_node, elements.Fiber):
edgeweight = prev_node.params.length
else:
edgeweight = 0.01
network.add_edge(prev_node, next_node, weight=edgeweight)
def add_connector_loss(network, fibers, default_con_in, default_con_out, EOL):
for fiber in fibers:
if fiber.params.con_in is None:
fiber.params.con_in = default_con_in
if fiber.params.con_out is None:
fiber.params.con_out = default_con_out
next_node = next(n for n in network.successors(fiber))
if not isinstance(next_node, elements.Fused):
fiber.params.con_out += EOL
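# Illustration (assumed default values, not from the diff): with default_con_in = 0.5 dB,
# default_con_out = 0.5 dB and EOL = 1 dB, a fiber with no explicit connector losses
# gets params.con_in = 0.5 dB and params.con_out = 1.5 dB; the EOL margin is not added
# when the next element is a Fused node, since the span continues past the splice and
# the margin belongs at the true end of the span.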
def add_fiber_padding(network, fibers, padding):
"""last_fibers = (fiber for n in network.nodes()
if not (isinstance(n, elements.Fiber) or isinstance(n, elements.Fused))
for fiber in network.predecessors(n)
if isinstance(fiber, elements.Fiber))"""
for fiber in fibers:
this_span_loss = span_loss(network, fiber)
try:
next_node = next(network.successors(fiber))
except StopIteration:
raise NetworkTopologyError(f'Fiber {fiber.uid} is not properly connected, please check network topology')
if this_span_loss < padding and not (isinstance(next_node, elements.Fused)):
# add a padding att_in at the input of the 1st fiber:
# address the case when several fibers are spliced together
first_fiber = find_first_node(network, fiber)
# in order to support no booster, fused might be placed
# just after a roadm: need to check that first_fiber is really a fiber
if isinstance(first_fiber, elements.Fiber):
if first_fiber.params.att_in is None:
first_fiber.params.att_in = padding - this_span_loss
else:
first_fiber.params.att_in = first_fiber.params.att_in + padding - this_span_loss
def build_network(network, equipment, pref_ch_db, pref_total_db):
default_span_data = equipment['Span']['default']
max_length = int(convert_length(default_span_data.max_length, default_span_data.length_units))
min_length = max(int(default_span_data.padding / 0.2 * 1e3), 50_000)
bounds = range(min_length, max_length)
target_length = max(min_length, 90_000)
default_con_in = default_span_data.con_in
default_con_out = default_span_data.con_out
padding = default_span_data.padding
# set roadm loss for gain_mode before to build network
fibers = [f for f in network.nodes() if isinstance(f, elements.Fiber)]
add_connector_loss(network, fibers, default_con_in, default_con_out, default_span_data.EOL)
add_fiber_padding(network, fibers, padding)
# don't group split fiber and add amp in the same loop
@@ -534,27 +468,17 @@ def build_network(network, equipment, pref_ch_db, pref_total_db):
for fiber in fibers:
split_fiber(network, fiber, bounds, target_length, equipment)
amplified_nodes = [n for n in network.nodes() if isinstance(n, elements.Fiber) or isinstance(n, elements.Roadm)]
for node in amplified_nodes:
add_egress_amplifier(network, node)
roadms = [r for r in network.nodes() if isinstance(r, elements.Roadm)]
for roadm in roadms:
set_egress_amplifier(network, roadm, equipment, pref_total_db)
# support older json input topology wo Roadms:
if len(roadms) == 0:
trx = [t for t in network.nodes() if isinstance(t, elements.Transceiver)]
for t in trx:
set_egress_amplifier(network, t, equipment, pref_total_db)
def load_sim_params(filename):
sim_params = load_json(filename)
return SimParams(params=sim_params)
def configure_network(network, sim_params):
for node in network.nodes:
if isinstance(node, RamanFiber):
node.sim_params = sim_params
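# Minimal usage sketch (hypothetical file name; uses the two functions defined above):
#
#   sim_params = load_sim_params('sim_params.json')  # returns a SimParams instance
#   configure_network(network, sim_params)           # attaches it to every RamanFiber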


@@ -1,56 +0,0 @@
#! /bin/usr/python3
# -*- coding: utf-8 -*-
'''
gnpy.core.node
==============
This module contains the base class for a network element.
Strictly, a network element is any callable which accepts an immutable
:class:`.info.SpectralInformation` object and returns an :class:`.info.SpectralInformation` object
(a copy).
Network elements MUST implement two attributes .uid and .name representing a
unique identifier and a printable name.
This base class provides a more convenient way to define a network element
via subclassing.
'''
from uuid import uuid4
from collections import namedtuple
class Location(namedtuple('Location', 'latitude longitude city region')):
def __new__(cls, latitude=0, longitude=0, city=None, region=None):
return super().__new__(cls, latitude, longitude, city, region)
class Node:
def __init__(self, uid, name=None, params=None, metadata=None, operational=None):
if name is None:
name = uid
self.uid, self.name = uid, name
if metadata is None:
metadata = {'location': {}}
if metadata and not isinstance(metadata.get('location'), Location):
metadata['location'] = Location(**metadata.pop('location', {}))
self.params, self.metadata, self.operational = params, metadata, operational
@property
def coords(self):
return self.lng, self.lat
@property
def location(self):
return self.metadata['location']
loc = location
@property
def longitude(self):
return self.location.longitude
lng = longitude
@property
def latitude(self):
return self.location.latitude
lat = latitude

gnpy/core/parameters.py Normal file

@@ -0,0 +1,287 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.core.parameters
====================
This module contains all parameters to configure standard network elements.
"""
from scipy.constants import c, pi
from numpy import squeeze, log10, exp
from gnpy.core.utils import db2lin, convert_length
from gnpy.core.exceptions import ParametersError
class Parameters:
def asdict(self):
class_dict = self.__class__.__dict__
instance_dict = self.__dict__
new_dict = {}
for key in class_dict:
if isinstance(class_dict[key], property):
new_dict[key] = instance_dict['_' + key]
return new_dict
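# Sketch of the asdict() contract (illustrative subclass, not part of the commit):
# each class-level property `x` is assumed to be backed by an instance attribute
# `_x`, and asdict() collects those pairs into a plain dict.
class _DemoParams(Parameters):
    def __init__(self, power):
        self._power = power

    @property
    def power(self):
        return self._power

# _DemoParams(3).asdict() == {'power': 3}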
class PumpParams(Parameters):
def __init__(self, power, frequency, propagation_direction):
self._power = power
self._frequency = frequency
self._propagation_direction = propagation_direction
@property
def power(self):
return self._power
@property
def frequency(self):
return self._frequency
@property
def propagation_direction(self):
return self._propagation_direction
class RamanParams(Parameters):
def __init__(self, **kwargs):
self._flag_raman = kwargs['flag_raman']
self._space_resolution = kwargs['space_resolution'] if 'space_resolution' in kwargs else None
self._tolerance = kwargs['tolerance'] if 'tolerance' in kwargs else None
@property
def flag_raman(self):
return self._flag_raman
@property
def space_resolution(self):
return self._space_resolution
@property
def tolerance(self):
return self._tolerance
class NLIParams(Parameters):
def __init__(self, **kwargs):
self._nli_method_name = kwargs['nli_method_name']
self._wdm_grid_size = kwargs['wdm_grid_size']
self._dispersion_tolerance = kwargs['dispersion_tolerance']
self._phase_shift_tolerance = kwargs['phase_shift_tolerance']
self._f_cut_resolution = None
self._f_pump_resolution = None
self._computed_channels = kwargs['computed_channels'] if 'computed_channels' in kwargs else None
@property
def nli_method_name(self):
return self._nli_method_name
@property
def wdm_grid_size(self):
return self._wdm_grid_size
@property
def dispersion_tolerance(self):
return self._dispersion_tolerance
@property
def phase_shift_tolerance(self):
return self._phase_shift_tolerance
@property
def f_cut_resolution(self):
return self._f_cut_resolution
@f_cut_resolution.setter
def f_cut_resolution(self, f_cut_resolution):
self._f_cut_resolution = f_cut_resolution
@property
def f_pump_resolution(self):
return self._f_pump_resolution
@f_pump_resolution.setter
def f_pump_resolution(self, f_pump_resolution):
self._f_pump_resolution = f_pump_resolution
@property
def computed_channels(self):
return self._computed_channels
class SimParams(Parameters):
def __init__(self, **kwargs):
try:
if 'nli_parameters' in kwargs:
self._nli_params = NLIParams(**kwargs['nli_parameters'])
else:
self._nli_params = None
if 'raman_parameters' in kwargs:
self._raman_params = RamanParams(**kwargs['raman_parameters'])
else:
self._raman_params = None
except KeyError as e:
raise ParametersError(f'Simulation parameters must include {e}. Configuration: {kwargs}')
@property
def nli_params(self):
return self._nli_params
@property
def raman_params(self):
return self._raman_params
class FiberParams(Parameters):
def __init__(self, **kwargs):
try:
self._length = convert_length(kwargs['length'], kwargs['length_units'])
# fixed attenuator for padding
self._att_in = kwargs['att_in'] if 'att_in' in kwargs else 0
# if not defined in the network json connector loss in/out
# the None value will be updated in network.py[build_network]
# with default values from eqpt_config.json[Spans]
self._con_in = kwargs['con_in'] if 'con_in' in kwargs else None
self._con_out = kwargs['con_out'] if 'con_out' in kwargs else None
if 'ref_wavelength' in kwargs:
self._ref_wavelength = kwargs['ref_wavelength']
self._ref_frequency = c / self.ref_wavelength
elif 'ref_frequency' in kwargs:
self._ref_frequency = kwargs['ref_frequency']
self._ref_wavelength = c / self.ref_frequency
else:
self._ref_wavelength = 1550e-9
self._ref_frequency = c / self.ref_wavelength
self._dispersion = kwargs['dispersion'] # s/m/m
self._dispersion_slope = kwargs['dispersion_slope'] if 'dispersion_slope' in kwargs else \
-2 * self._dispersion/self.ref_wavelength # s/m/m/m
self._beta2 = -(self.ref_wavelength ** 2) * self.dispersion / (2 * pi * c) # 1/(m * Hz^2)
# Eq. (3.23) in Abramczyk, Halina. "Dispersion phenomena in optical fibers." Virtual European University
# on Lasers. Available online: http://mitr.p.lodz.pl/evu/lectures/Abramczyk3.pdf
# (accessed on 25 March 2018) (2005).
self._beta3 = ((self.dispersion_slope - (4*pi*c/self.ref_wavelength**3) * self.beta2) /
(2*pi*c/self.ref_wavelength**2)**2)
self._gamma = kwargs['gamma'] # 1/W/m
self._pmd_coef = kwargs['pmd_coef'] # s/sqrt(m)
if type(kwargs['loss_coef']) == dict:
self._loss_coef = squeeze(kwargs['loss_coef']['loss_coef_power']) * 1e-3 # lineic loss dB/m
self._f_loss_ref = squeeze(kwargs['loss_coef']['frequency']) # Hz
else:
self._loss_coef = kwargs['loss_coef'] * 1e-3 # lineic loss dB/m
self._f_loss_ref = 193.5e12 # Hz
self._lin_attenuation = db2lin(self.length * self.loss_coef)
self._lin_loss_exp = self.loss_coef / (10 * log10(exp(1))) # linear power exponent loss Neper/m
self._effective_length = (1 - exp(- self.lin_loss_exp * self.length)) / self.lin_loss_exp
self._asymptotic_length = 1 / self.lin_loss_exp
# raman parameters (not compulsory)
self._raman_efficiency = kwargs['raman_efficiency'] if 'raman_efficiency' in kwargs else None
self._pumps_loss_coef = kwargs['pumps_loss_coef'] if 'pumps_loss_coef' in kwargs else None
except KeyError as e:
raise ParametersError(f'Fiber configurations json must include {e}. Configuration: {kwargs}')
@property
def length(self):
return self._length
@length.setter
def length(self, length):
"""length must be in m"""
self._length = length
@property
def att_in(self):
return self._att_in
@att_in.setter
def att_in(self, att_in):
self._att_in = att_in
@property
def con_in(self):
return self._con_in
@con_in.setter
def con_in(self, con_in):
self._con_in = con_in
@property
def con_out(self):
return self._con_out
@con_out.setter
def con_out(self, con_out):
self._con_out = con_out
@property
def dispersion(self):
return self._dispersion
@property
def dispersion_slope(self):
return self._dispersion_slope
@property
def gamma(self):
return self._gamma
@property
def pmd_coef(self):
return self._pmd_coef
@property
def ref_wavelength(self):
return self._ref_wavelength
@property
def ref_frequency(self):
return self._ref_frequency
@property
def beta2(self):
return self._beta2
@property
def beta3(self):
return self._beta3
@property
def loss_coef(self):
return self._loss_coef
@property
def f_loss_ref(self):
return self._f_loss_ref
@property
def lin_loss_exp(self):
return self._lin_loss_exp
@property
def lin_attenuation(self):
return self._lin_attenuation
@property
def effective_length(self):
return self._effective_length
@property
def asymptotic_length(self):
return self._asymptotic_length
@property
def raman_efficiency(self):
return self._raman_efficiency
@property
def pumps_loss_coef(self):
return self._pumps_loss_coef
def asdict(self):
dictionary = super().asdict()
dictionary['loss_coef'] = self.loss_coef * 1e3
dictionary['length_units'] = 'm'
return dictionary
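# Numeric sanity check (illustrative values): with the default 1550 nm reference and
# a typical SSMF dispersion of 16.7 ps/nm/km (16.7e-6 s/m/m),
#   beta2 = -(1550e-9)**2 * 16.7e-6 / (2 * pi * c) ~ -2.13e-26 s^2/m,
# i.e. about -21.3 ps^2/km, matching the usual textbook value for standard fiber.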


@@ -1,923 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.core.request
=================
This module contains path request functionality.
This functionality allows the user to provide a JSON request
file in accordance with a Yang model for requesting path
computations and returns path results in terms of path
and feasibility
See: draft-ietf-teas-yang-path-computation-01.txt
"""
from sys import exit
from collections import namedtuple
from logging import getLogger, basicConfig, CRITICAL, DEBUG, INFO
from networkx import (dijkstra_path, NetworkXNoPath, all_simple_paths,shortest_path_length)
from networkx.utils import pairwise
from numpy import mean
from gnpy.core.service_sheet import convert_service_sheet, Request_element, Element
from gnpy.core.elements import Transceiver, Roadm, Edfa, Fused
from gnpy.core.utils import db2lin, lin2db
from gnpy.core.info import create_input_spectral_information, SpectralInformation, Channel, Power
from copy import copy, deepcopy
from csv import writer
from math import ceil
logger = getLogger(__name__)
RequestParams = namedtuple('RequestParams','request_id source destination trx_type'+
' trx_mode nodes_list loose_list spacing power nb_channel f_min f_max format baud_rate OSNR bit_rate roll_off tx_osnr min_spacing cost path_bandwidth')
DisjunctionParams = namedtuple('DisjunctionParams','disjunction_id relaxable link_diverse node_diverse disjunctions_req')
class Path_request:
def __init__(self, *args, **params):
params = RequestParams(**params)
self.request_id = params.request_id
self.source = params.source
self.destination = params.destination
self.tsp = params.trx_type
self.tsp_mode = params.trx_mode
self.baud_rate = params.baud_rate
self.nodes_list = params.nodes_list
self.loose_list = params.loose_list
self.spacing = params.spacing
self.power = params.power
self.nb_channel = params.nb_channel
self.f_min = params.f_min
self.f_max = params.f_max
self.format = params.format
self.OSNR = params.OSNR
self.bit_rate = params.bit_rate
self.roll_off = params.roll_off
self.tx_osnr = params.tx_osnr
self.min_spacing = params.min_spacing
self.cost = params.cost
self.path_bandwidth = params.path_bandwidth
def __str__(self):
return '\n\t'.join([ f'{type(self).__name__} {self.request_id}',
f'source: {self.source}',
f'destination: {self.destination}'])
def __repr__(self):
if self.baud_rate is not None:
temp = self.baud_rate * 1e-9
temp2 = self.bit_rate * 1e-9
else:
temp = self.baud_rate
temp2 = self.bit_rate
return '\n\t'.join([ f'{type(self).__name__} {self.request_id}',
f'source: \t{self.source}',
f'destination:\t{self.destination}',
f'trx type:\t{self.tsp}',
f'trx mode:\t{self.tsp_mode}',
f'baud_rate:\t{temp} Gbaud',
f'bit_rate:\t{temp2} Gb/s',
f'spacing:\t{self.spacing * 1e-9} GHz',
f'power: \t{round(lin2db(self.power)+30,2)} dBm',
f'nb channels: \t{self.nb_channel}',
f'path_bandwidth: \t{round(self.path_bandwidth * 1e-9,2)} Gbit/s',
f'nodes-list:\t{self.nodes_list}',
f'loose-list:\t{self.loose_list}'
'\n'])
class Disjunction:
def __init__(self, *args, **params):
params = DisjunctionParams(**params)
self.disjunction_id = params.disjunction_id
self.relaxable = params.relaxable
self.link_diverse = params.link_diverse
self.node_diverse = params.node_diverse
self.disjunctions_req = params.disjunctions_req
def __str__(self):
return '\n\t'.join([f'relaxable: {self.relaxable}',
f'link-diverse: {self.link_diverse}',
f'node-diverse: {self.node_diverse}',
f'request-id-numbers: {self.disjunctions_req}']
)
def __repr__(self):
return '\n\t'.join([ f'{type(self).__name__} {self.disjunction_id}',
f'relaxable: {self.relaxable}',
f'link-diverse: {self.link_diverse}',
f'node-diverse: {self.node_diverse}',
f'request-id-numbers: {self.disjunctions_req}'
'\n'])
class Result_element(Element):
def __init__(self,path_request,computed_path):
self.path_id = path_request.request_id
self.path_request = path_request
self.computed_path = computed_path
hop_type = []
if len(computed_path)>0 :
for e in computed_path :
if isinstance(e, Transceiver) :
hop_type.append(' - '.join([path_request.tsp,path_request.tsp_mode]))
else:
hop_type.append('not recorded')
else:
# TODO differentiate empty path in case not feasible because of tsp or not feasible because
# there is no path connecting the nodes (whatever the tsp)
mode = 'not feasible with this transponder'
hop_type = ' - '.join([path_request.tsp,mode])
self.hop_type = hop_type
uid = property(lambda self: repr(self))
@property
def pathresult(self):
if not self.computed_path:
return {
'path-id': self.path_id,
'path-properties':{
'path-metric': [
{
'metric-type': 'SNR@bandwidth',
'accumulative-value': 'None'
},
{
'metric-type': 'SNR@0.1nm',
'accumulative-value': 'None'
},
{
'metric-type': 'OSNR@bandwidth',
'accumulative-value': 'None'
},
{
'metric-type': 'OSNR@0.1nm',
'accumulative-value': 'None'
},
{
'metric-type': 'reference_power',
'accumulative-value': self.path_request.power
},
{
'metric-type': 'path_bandwidth',
'accumulative-value': self.path_request.path_bandwidth
}
],
'path-srlgs': {
'usage': 'not used yet',
'values': 'not used yet'
},
'path-route-objects': [
{
'path-route-object': {
'index': 0,
'unnumbered-hop': {
'node-id': self.path_request.source,
'link-tp-id': self.path_request.source,
'hop-type': self.hop_type,
'direction': 'not used'
},
'label-hop': {
'te-label': {
'generic': 'not used yet',
'direction': 'not used yet'
}
}
}
},
{
'path-route-object': {
'index': 1,
'unnumbered-hop': {
'node-id': self.path_request.destination,
'link-tp-id': self.path_request.destination,
'hop-type': self.hop_type,
'direction': 'not used'
},
'label-hop': {
'te-label': {
'generic': 'not used yet',
'direction': 'not used yet'
}
}
}
}
]
}
}
else:
return {
'path-id': self.path_id,
'path-properties':{
'path-metric': [
{
'metric-type': 'SNR@bandwidth',
'accumulative-value': round(mean(self.computed_path[-1].snr),2)
},
{
'metric-type': 'SNR@0.1nm',
'accumulative-value': round(mean(self.computed_path[-1].snr+lin2db(self.path_request.baud_rate/12.5e9)),2)
},
{
'metric-type': 'OSNR@bandwidth',
'accumulative-value': round(mean(self.computed_path[-1].osnr_ase),2)
},
{
'metric-type': 'OSNR@0.1nm',
'accumulative-value': round(mean(self.computed_path[-1].osnr_ase_01nm),2)
},
{
'metric-type': 'reference_power',
'accumulative-value': self.path_request.power
},
{
'metric-type': 'path_bandwidth',
'accumulative-value': self.path_request.path_bandwidth
}
],
'path-srlgs': {
'usage': 'not used yet',
'values': 'not used yet'
},
'path-route-objects': [
{
'path-route-object': {
'index': self.computed_path.index(n),
'unnumbered-hop': {
'node-id': n.uid,
'link-tp-id': n.uid,
'hop-type': self.hop_type[self.computed_path.index(n)],
'direction': 'not used'
},
'label-hop': {
'te-label': {
'generic': 'not used yet',
'direction': 'not used yet'
}
}
}
} for n in self.computed_path
]
}
}
@property
def json(self):
return self.pathresult
def compute_constrained_path(network, req):
trx = [n for n in network.nodes() if isinstance(n, Transceiver)]
roadm = [n for n in network.nodes() if isinstance(n, Roadm)]
edfa = [n for n in network.nodes() if isinstance(n, Edfa)]
anytypenode = [n for n in network.nodes()]
source = next(el for el in trx if el.uid == req.source)
# This method ensures that the constraint can be satisfied without loops
# except when it is not possible: e.g. if constraints make a loop
# It requires that the source, dest and nodes are correct (no error in the names)
destination = next(el for el in trx if el.uid == req.destination)
nodes_list = []
for n in req.nodes_list :
# for debug excel print(n)
nodes_list.append(next(el for el in anytypenode if el.uid == n))
# nodes_list contains at least the destination
if nodes_list is None :
msg = f'Request {req.request_id} problem in the constitution of nodes_list: should at least include destination'
logger.critical(msg)
exit()
if req.nodes_list[-1] != req.destination:
msg = f'Request {req.request_id} malformed list of nodes: last node should be destination trx'
logger.critical(msg)
exit()
if len(nodes_list) == 1 :
try :
total_path = dijkstra_path(network, source, destination, weight = 'weight')
# print('checking edges length is correct')
# print(shortest_path_length(network,source,destination))
# print(shortest_path_length(network,source,destination,weight ='weight'))
# s = total_path[0]
# for e in total_path[1:]:
# print(s.uid)
# print(network.get_edge_data(s,e))
# s = e
except NetworkXNoPath:
msg = f'\x1b[1;33;40m'+f'Request {req.request_id} could not find a path from {source.uid} to node : {destination.uid} in network topology'+ '\x1b[0m'
logger.critical(msg)
print(msg)
total_path = []
else :
all_simp_pths = list(all_simple_paths(network,source=source,\
target=destination, cutoff=120))
candidate = []
for p in all_simp_pths :
if ispart(nodes_list, p) :
# print(f'selection{[el.uid for el in p if el in roadm]}')
candidate.append(p)
# select the shortest path (in nb of hops) -> changed to shortest path in km length
if len(candidate)>0 :
# candidate.sort(key=lambda x: len(x))
candidate.sort(key=lambda x: sum(network.get_edge_data(x[i],x[i+1])['weight'] for i in range(len(x)-2)))
total_path = candidate[0]
else:
if req.loose_list[req.nodes_list.index(n)] == 'loose':
print(f'\x1b[1;33;40m'+f'Request {req.request_id} could not find a path crossing {nodes_list} in network topology'+ '\x1b[0m')
print(f'constraint ignored')
total_path = dijkstra_path(network, source, destination, weight = 'weight')
else:
msg = f'\x1b[1;33;40m'+f'Request {req.request_id} could not find a path crossing {nodes_list}.\nNo path computed'+ '\x1b[0m'
logger.critical(msg)
print(msg)
total_path = []
# obsolete method: this does not guarantee loop avoidance or correct results
# Here is the demonstration :
# 1 1
# eg a----b-----c
# |1 |0.5 |1
# e----f--h--g
# 1 0.5 0.5
# if I have to compute a to g with constraint f-c
# result will be a concatenation of: a-b-f and f-b-c and c-g
# which means a loop.
# if to avoid loops I iteratively suppress edges of the segments in the topo
# segment 1 = a-b-f
# 1
# eg a b-----c
# |1 |1
# e----f--h--g
# 1 0.5 0.5
# then
# segment 2 = f-h-g-c
# 1
# eg a b-----c
# |1
# e----f h g
# 1
# then there is no more path to g destination
#
#
# total_path = [source]
# for n in req.nodes_list:
# try :
# node = next(el for el in trx if el.uid == n)
# except StopIteration:
# try:
# node = next(el for el in anytypenode if el.uid == n)
# except StopIteration:
# try:
# # TODO this test is not giving good results: full name of the
# # amp is required to avoid ambiguity on the direction
# node = next(el for el in anytypenode
# if n in el.uid)
# except StopIteration:
# msg = f'could not find node : {n} in network topology: \
# not a trx, roadm, edfa, fiber or fused element'
# logger.critical(msg)
# raise ValueError(msg)
# # extend path list without repeating source -> skip first element in the list
# try:
# # to avoid looping back: use an alternate graph were current path edges and vertex are suppressed
# total_path.extend(dijkstra_path(network, source, node)[1:])
# source = node
# except NetworkXNoPath:
# if req.loose_list[req.nodes_list.index(n)] == 'loose':
# print(f'could not find a path from {source.uid} to loose node : {n} in network topology')
# print(f'node {n} is skipped')
# else:
# msg = f'could not find a path from {source.uid} to node : {n} in network topology'
# logger.critical(msg)
# print(msg)
# total_path = []
return total_path
def propagate(path, req, equipment):
si = create_input_spectral_information(
req.f_min, req.f_max, req.roll_off, req.baud_rate,
req.power, req.spacing)
for el in path:
si = el(si)
path[-1].update_snr(req.tx_osnr, equipment['Roadm']['default'].add_drop_osnr)
return path
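# Usage sketch (assumes a network, a Path_request `req` and the loaded `equipment`
# dict, as used elsewhere in this module):
#
#   path = compute_constrained_path(network, req)
#   if path:
#       propagate(path, req, equipment)
#       print(f'mean SNR at destination: {mean(path[-1].snr):.2f} dB')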
def propagate2(path, req, equipment):
si = create_input_spectral_information(
req.f_min, req.f_max, req.roll_off, req.baud_rate,
req.power, req.spacing)
infos = {}
for el in path:
before_si = si
after_si = si = el(si)
infos[el] = before_si, after_si
path[-1].update_snr(req.tx_osnr, equipment['Roadm']['default'].add_drop_osnr)
return infos
def propagate_and_optimize_mode(path, req, equipment):
# if mode is unknown: loop on the modes starting from the highest baudrate fitting in the spacing
# step 1: create an ordered list of modes based on baudrate
baudrate_to_explore = list(set([m['baud_rate'] for m in equipment['Transceiver'][req.tsp].mode
if float(m['min_spacing'])<= req.spacing]))
# TODO be careful on limit cases if spacing is very close to req spacing, e.g. 50.001 vs 50.000
baudrate_to_explore = sorted(baudrate_to_explore, reverse=True)
if baudrate_to_explore :
# at least 1 baudrate can be tested wrt spacing
for b in baudrate_to_explore :
modes_to_explore = [m for m in equipment['Transceiver'][req.tsp].mode
if m['baud_rate'] == b and float(m['min_spacing'])<= req.spacing]
modes_to_explore = sorted(modes_to_explore,
key = lambda x: x['bit_rate'], reverse=True)
# print(modes_to_explore)
# step2 : computes propagation for each baudrate: stop and select the first that passes
found_a_feasible_mode = False
# TODO : the case of roll off is not included: for now use the SI one
# TODO : if the loop in mode optimization does not have a feasible path, then bugs
si = create_input_spectral_information(
req.f_min, req.f_max, equipment['SI']['default'].roll_off,
b, req.power, req.spacing)
for el in path:
si = el(si)
for m in modes_to_explore :
if path[-1].snr is not None:
path[-1].update_snr(m['tx_osnr'], equipment['Roadm']['default'].add_drop_osnr)
if round(min(path[-1].snr+lin2db(b/(12.5e9))),2) > m['OSNR'] :
found_a_feasible_mode = True
return path, m
else:
return [], None
# only get to this point if no baudrate/mode satisfies OSNR requirement
# returns the last propagated path and mode
msg = f'\tWarning! Request {req.request_id}: no mode satisfies path SNR requirement.\n'
print(msg)
logger.info(msg)
return [],None
else :
# no baudrate satisfying spacing
msg = f'\tWarning! Request {req.request_id}: no baudrate satisfies spacing requirement.\n'
print(msg)
logger.info(msg)
return [], None
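# Exploration order, illustrated (mode values assumed): with candidate modes
# {64 Gbaud / 400 Gb/s, 64 Gbaud / 300 Gb/s, 32 Gbaud / 200 Gb/s} and a compatible
# spacing, the loop propagates once per baudrate (highest first) and then tests the
# modes of that baudrate by decreasing bit_rate; the first mode whose required OSNR
# is met by the computed SNR is returned.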
def jsontocsv(json_data,equipment,fileout):
# read json path result file in accordance with:
# Yang model for requesting Path Computation
# draft-ietf-teas-yang-path-computation-01.txt.
# and write results in an CSV file
mywriter = writer(fileout)
mywriter.writerow(('path-id','source','destination','path_bandwidth','Pass?',\
'nb of tsp pairs','total cost','transponder-type','transponder-mode',\
'OSNR@0.1nm','SNR@0.1nm','SNR@bandwidth','baud rate (Gbaud)',\
'input power (dBm)','path'))
tspjsondata = equipment['Transceiver']
#print(tspjsondata)
for p in json_data['path']:
path_id = p['path-id']
source = p['path-properties']['path-route-objects'][0]\
['path-route-object']['unnumbered-hop']['node-id']
destination = p['path-properties']['path-route-objects'][-1]\
['path-route-object']['unnumbered-hop']['node-id']
# selects only roadm nodes
pth = ' | '.join([ e['path-route-object']['unnumbered-hop']['node-id']
for e in p['path-properties']['path-route-objects']
if e['path-route-object']['unnumbered-hop']['node-id'].startswith('roadm') or e['path-route-object']['unnumbered-hop']['node-id'].startswith('Edfa')])
[tsp,mode] = p['path-properties']['path-route-objects'][0]\
['path-route-object']['unnumbered-hop']['hop-type'].split(' - ')
# find the min acceptable OSNR, baud rate from the eqpt library based on tsp (type) and mode (format)
# loading equipment already tests the existence of tsp type and mode:
if mode !='not feasible with this transponder' :
[minosnr, baud_rate, bit_rate, cost] = next([m['OSNR'] , m['baud_rate'] , m['bit_rate'], m['cost']]
for m in equipment['Transceiver'][tsp].mode if m['format']==mode)
# else:
# [minosnr, baud_rate, bit_rate] = ['','','','']
output_snr = next(e['accumulative-value']
for e in p['path-properties']['path-metric'] if e['metric-type'] == 'SNR@0.1nm')
output_snrbandwidth = next(e['accumulative-value']
for e in p['path-properties']['path-metric'] if e['metric-type'] == 'SNR@bandwidth')
output_osnr = next(e['accumulative-value']
for e in p['path-properties']['path-metric'] if e['metric-type'] == 'OSNR@0.1nm')
output_osnrbandwidth = next(e['accumulative-value']
for e in p['path-properties']['path-metric'] if e['metric-type'] == 'OSNR@bandwidth')
power = next(e['accumulative-value']
for e in p['path-properties']['path-metric'] if e['metric-type'] == 'reference_power')
path_bandwidth = next(e['accumulative-value']
for e in p['path-properties']['path-metric'] if e['metric-type'] == 'path_bandwidth')
if isinstance(output_snr, str):
isok = False
nb_tsp = 0
pthbdbw = round(path_bandwidth*1e-9,2)
rosnr = ''
rsnr = ''
rsnrb = ''
br = ''
pw = ''
total_cost = ''
else:
isok = output_snr >= minosnr
nb_tsp = ceil(path_bandwidth / bit_rate)
pthbdbw = round(path_bandwidth*1e-9,2)
rosnr = round(output_osnr,2)
rsnr = round(output_snr,2)
rsnrb = round(output_snrbandwidth,2)
br = round(baud_rate*1e-9,2)
pw = round(lin2db(power)+30,2)
total_cost = nb_tsp * cost
mywriter.writerow((path_id,
source,
destination,
pthbdbw,
isok,
nb_tsp,
total_cost,
tsp,
mode,
rosnr,
rsnr,
rsnrb,
br,
pw,
pth
))
def compute_path_dsjctn(network, equipment, pathreqlist, disjunctions_list):
# pathreqlist is a list of Path_request objects
# disjunctions_list a list of Disjunction objects
# given a network, a list of requests with the set of disjunction features between
# request, the function computes the set of path satisfying : first the disjunction
# constraint and second the routing constraint if the request include an explicit
# set of elements to pass through.
# the algorithm used allows to specify disjunction for demands not sharing source or
# destination.
# a request might be declared as disjoint from several requests
# it is a iterative process:
# first computes a list of all shortest path (this may add computation time)
# second elaborate the set of path solution for each synchronization vector
# third select only the candidates that satisfy all synchronization vectors they belong to
# fourth apply route constraints : remove candidate path that do not satisfy the constraint
# fifth select the first candidate among the set of candidates.
# the example network used in comments has been added to the set of data tests files
# define the list to be returned
path_res_list = []
# all disjctn must be computed at once together to avoid blocking
# 1 1
# eg a----b-----c
# |1 |0.5 |1
# e----f--h--g
# 1 0.5 0.5
# if I have to compute a to g and a to h
# I must not compute a-b-f-h-g, otherwise there is no disjoint path remaining for a to h
# instead I should list all most disjoint path and select the one that have the less
# number of commonalities
# \ path abfh aefh abcgh
# \___cost 2 2.5 3.5
# path| cost
# abfhg| 2.5 x x x
# abcg | 3 x x
# aefhg| 3 x x x
# from this table abcg and aefh have no common links and should be preferred
# even they are not the shortest paths
# build the list of pathreqlist elements not concerned by disjunction
global_disjunctions_list = [e for d in disjunctions_list for e in d.disjunctions_req ]
pathreqlist_simple = [e for e in pathreqlist if e.request_id not in global_disjunctions_list]
pathreqlist_disjt = [e for e in pathreqlist if e.request_id in global_disjunctions_list]
# use a mirror class to record path and the corresponding requests
class Pth:
def __init__(self, req, pth, simplepth):
self.req = req
self.pth = pth
self.simplepth = simplepth
# step 1
# for each remaining request compute a set of simple path
allpaths = {}
rqs = {}
simple_rqs = {}
simple_rqs_reversed = {}
for pathreq in pathreqlist_disjt :
all_simp_pths = list(all_simple_paths(network,\
source=next(el for el in network.nodes() if el.uid == pathreq.source),\
target=next(el for el in network.nodes() if el.uid == pathreq.destination),\
cutoff=80))
# sort them in km length instead of hop
# all_simp_pths = sorted(all_simp_pths, key=lambda path: len(path))
all_simp_pths = sorted(all_simp_pths, key=lambda \
x: sum(network.get_edge_data(x[i],x[i+1])['weight'] for i in range(len(x)-2)))
# reversed direction paths required to check disjunction on both direction
all_simp_pths_reversed = []
for pth in all_simp_pths:
all_simp_pths_reversed.append(find_reversed_path(pth,network))
rqs[pathreq.request_id] = all_simp_pths
temp =[]
for p in all_simp_pths :
# build a short list representing each roadm+direction with the first item
# start enumeration at 1 to avoid Trx in the list
s = [e.uid for i,e in enumerate(p[1:-1]) \
if (isinstance(e,Roadm) | (isinstance(p[i],Roadm) ))]
temp.append(s)
# id(s) is unique even if path is the same: two objects with same
# path have two different ids
allpaths[id(s)] = Pth(pathreq,p,s)
simple_rqs[pathreq.request_id] = temp
temp =[]
for p in all_simp_pths_reversed :
# build a short list representing each roadm+direction with the first item
# start enumeration at 1 to avoid Trx in the list
temp.append([e.uid for i,e in enumerate(p[1:-1]) \
if (isinstance(e,Roadm) | (isinstance(p[i],Roadm) ))] )
simple_rqs_reversed[pathreq.request_id] = temp
# step 2
# for each set of requests that need to be disjoint
# select the disjoint path combination
candidates = {}
for d in disjunctions_list :
dlist = d.disjunctions_req.copy()
# each line of dpath is one combination of path that satisfies disjunction
dpath = []
for i,p in enumerate(simple_rqs[dlist[0]]):
dpath.append([p])
# allpaths[id(p)].d_id = d.disjunction_id
# in each loop, dpath is updated with a path for rq that satisfies
# disjunction with each path in dpath
# for example, assume set of requests in the vector (disjunction_list) is {rq1,rq2, rq3}
# rq1 p1: abfhg
# p2: aefhg
# p3: abcg
# rq2 p8: bf
# rq3 p4: abcgh
# p6: aefh
# p7: abfh
# initiate with rq1
# dpath = [[p1]
# [p2]
# [p3]]
# after first loop:
# dpath = [[p1 p8]
# [p3 p8]]
# since p2 and p8 are not disjoint
# after second loop:
# dpath = [ p3 p8 p6 ]
# since p1 and p4 are not disjoint
# p1 and p7 are not disjoint
# p3 and p4 are not disjoint
# p3 and p7 are not disjoint
for e1 in dlist[1:] :
temp = []
for j,p1 in enumerate(simple_rqs[e1]):
# allpaths[id(p1)].d_id = d.disjunction_id
# can use index j in simple_rqs_reversed because index
# of direct and reversed paths have been kept identical
p1_reversed = simple_rqs_reversed[e1][j]
# print(p1_reversed)
# print('\n\n')
for k,c in enumerate(dpath) :
# print(f' c: \t{c}')
temp2 = c.copy()
all_disjoint = 0
for p in c :
all_disjoint += isdisjoint(p1,p)+ isdisjoint(p1_reversed,p)
if all_disjoint ==0:
temp2.append(p1)
temp.append(temp2)
# print(f' coucou {e1}: \t{temp}')
dpath = temp
# print(dpath)
candidates[d.disjunction_id] = dpath
# for i in disjunctions_list :
# print(f'\n{candidates[i.disjunction_id]}')
# step 3
# now for each request, select the path that satisfies all disjunctions
# path must be in candidates[id] for all concerned ids
# for example, assume set of sync vectors (disjunction groups) is
# s1 = {rq1 rq2} s2 = {rq1 rq3}
# candidate[s1] = [[p1 p8]
# [p3 p8]]
# candidate[s2] = [[p3 p6]]
# for rq1 p3 should be preferred
for pathreq in pathreqlist_disjt:
concerned_d_id = [d.disjunction_id for d in disjunctions_list if pathreq.request_id in d.disjunctions_req]
# for each set of solution, verify that the same path is used for the same request
candidate_paths = simple_rqs[pathreq.request_id]
# print('coucou')
# print(pathreq.request_id)
for p in candidate_paths :
iscandidate = 0
for sol in concerned_d_id :
test = 1
# for each solution test if p is part of the solution
# if yes, then p can remain a candidate
for i,m in enumerate(candidates[sol]) :
if p in m:
if allpaths[id(m[m.index(p)])].req.request_id == pathreq.request_id :
test = 0
break
iscandidate += test
if iscandidate != 0:
for l in concerned_d_id :
for m in candidates[l] :
if p in m :
candidates[l].remove(m)
# for i in disjunctions_list :
# print(i.disjunction_id)
# print(f'\n{candidates[i.disjunction_id]}')
# step 4 apply route constraints : remove candidate path that do not satisfy the constraint
# only in the case of disjunction: the simple path is processed in request.compute_constrained_path
# TODO : keep a version without the loose constraint
for d in disjunctions_list :
temp = []
for j,sol in enumerate(candidates[d.disjunction_id]) :
testispartok = True
for i,p in enumerate(sol) :
# print(f'test {allpaths[id(p)].req.request_id}')
# print(f'length of route {len(allpaths[id(p)].req.nodes_list)}')
if allpaths[id(p)].req.nodes_list :
# if p does not contain the ordered list of nodes, remove sol from the candidates
# except if this was the last solution: then check if the constraint is loose or not
if not ispart(allpaths[id(p)].req.nodes_list, p) :
# print(f'nb of solutions {len(temp)}')
if j < len(candidates[d.disjunction_id])-1 :
msg = f'removing {sol}'
logger.info(msg)
testispartok = False
#break
else:
if 'loose' in allpaths[id(p)].req.loose_list:
logger.info(f'Could not apply route constraint'+
f'{allpaths[id(p)].req.nodes_list} on request {allpaths[id(p)].req.request_id}')
else :
logger.info(f'removing last solution from candidate paths\n{sol}')
testispartok = False
if testispartok :
temp.append(sol)
candidates[d.disjunction_id] = temp
# step 5 select the first combination that works
pathreslist_disjoint = {}
for d in disjunctions_list :
test_sol = True
while test_sol:
# print('coucou')
if candidates[d.disjunction_id] :
for p in candidates[d.disjunction_id][0]:
if allpaths[id(p)].req in pathreqlist_disjt:
# print(f'selected path :{p} for req {allpaths[id(p)].req.request_id}')
pathreslist_disjoint[allpaths[id(p)].req] = allpaths[id(p)].pth
pathreqlist_disjt.remove(allpaths[id(p)].req)
candidates = remove_candidate(candidates, allpaths, allpaths[id(p)].req, p)
test_sol = False
else:
msg = f'No disjoint path found with added constraint'
logger.critical(msg)
print(f'{msg}\nComputation stopped.')
# TODO in this case: replay step 5 with the candidate without constraints
exit()
# for i in disjunctions_list :
# print(i.disjunction_id)
# print(f'\n{candidates[i.disjunction_id]}')
# list the results in the same order as initial pathreqlist
for req in pathreqlist :
req.nodes_list.append(req.destination)
# we assume that the destination is a strict constraint
req.loose_list.append('strict')
if req in pathreqlist_simple:
path_res_list.append(compute_constrained_path(network, req))
else:
path_res_list.append(pathreslist_disjoint[req])
return path_res_list
def isdisjoint(p1,p2) :
# returns 0 if disjoint
edge1 = list(pairwise(p1))
edge2 = list(pairwise(p2))
for e in edge1 :
if e in edge2 :
return 1
return 0
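# Example (strings stand in for network elements; edges are ordered pairs):
#   isdisjoint(['a', 'b', 'c'], ['d', 'b', 'c'])  # 1: both paths use edge (b, c)
#   isdisjoint(['a', 'b', 'c'], ['a', 'e', 'c'])  # 0: no common edge
# Note that edges are direction-sensitive, which is why the reversed path is
# checked separately in compute_path_dsjctn.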
def find_reversed_path(p,network) :
# select of intermediate roadms and find the path between them
# note that this function may not give an exact result in case of multiple
# links between two adjacent nodes.
# TODO add some indication on elements to indicate from which other they
# are the reversed direction
reversed_roadm_path = list(reversed([e for e in p if isinstance (e,Roadm)]))
source = p[-1]
destination = p[0]
total_path = [source]
for node in reversed_roadm_path :
total_path.extend(dijkstra_path(network, source, node, weight = 'weight')[1:])
source = node
total_path.append(destination)
return total_path
def ispart(a,b) :
# the function takes two paths a and b and returns True
# if all a elements are part of b and in the same order
j = 0
for i, el in enumerate(a):
if el in b :
if b.index(el) >= j :
j = b.index(el)
else:
return False
else:
return False
return True
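# Example:
#   ispart(['a', 'c'], ['a', 'b', 'c', 'd'])  # True: ordered subsequence of b
#   ispart(['c', 'a'], ['a', 'b', 'c', 'd'])  # False: order violated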
def remove_candidate(candidates, allpaths, rq, pth) :
# print(f'coucou {rq.request_id}')
for key, candidate in candidates.items() :
temp = candidate.copy()
for i,sol in enumerate(candidate) :
for p in sol :
if allpaths[id(p)].req.request_id == rq.request_id :
if id(p) != id(pth) :
temp.remove(sol)
break
candidates[key] = temp
return candidates
def compare_reqs(req1,req2,disjlist) :
dis1 = [d for d in disjlist if req1.request_id in d.disjunctions_req]
dis2 = [d for d in disjlist if req2.request_id in d.disjunctions_req]
same_disj = False
if dis1 and dis2 :
temp1 = []
for d in dis1:
temp1.extend(d.disjunctions_req)
temp1.remove(req1.request_id)
temp2 = []
for d in dis2:
temp2.extend(d.disjunctions_req)
temp2.remove(req2.request_id)
if set(temp1) == set(temp2) :
same_disj = True
elif not dis2 and not dis1:
same_disj = True
if req1.source == req2.source and \
req1.destination == req2.destination and \
req1.tsp == req2.tsp and \
req1.tsp_mode == req2.tsp_mode and \
req1.baud_rate == req2.baud_rate and \
req1.nodes_list == req2.nodes_list and \
req1.loose_list == req2.loose_list and \
req1.spacing == req2.spacing and \
req1.power == req2.power and \
req1.nb_channel == req2.nb_channel and \
req1.f_min == req2.f_min and \
req1.f_max == req2.f_max and \
req1.format == req2.format and \
req1.OSNR == req2.OSNR and \
req1.roll_off == req2.roll_off and \
same_disj :
return True
else:
return False
def requests_aggregation(pathreqlist,disjlist) :
# this function aggregates requests so that if several requests
# exist between same source and destination and with same transponder type
# todo maybe add conditions on mode ??, spacing ...
# currently if undefined takes the default values
local_list = pathreqlist.copy()
for req in pathreqlist:
for r in local_list :
if req.request_id != r.request_id and compare_reqs(req, r, disjlist):
# aggregate
r.path_bandwidth += req.path_bandwidth
temp_r_id = r.request_id
r.request_id = ' | '.join((r.request_id,req.request_id))
# remove request from list
local_list.remove(req)
# todo change also disjunction req with new demand
for d in disjlist :
if req.request_id in d.disjunctions_req :
d.disjunctions_req.remove(req.request_id)
d.disjunctions_req.append(r.request_id)
for d in disjlist :
if temp_r_id in d.disjunctions_req :
disjlist.remove(d)
break
return local_list, disjlist


@@ -1,162 +1,27 @@
import numpy as np
from operator import attrgetter
from collections import namedtuple
from logging import getLogger
import scipy.constants as ph
from scipy.integrate import solve_bvp
from scipy.integrate import cumtrapz
from scipy.interpolate import interp1d
from scipy.optimize import OptimizeResult
from math import isclose
from gnpy.core.utils import db2lin, lin2db
from gnpy.core.exceptions import EquipmentConfigError
logger = getLogger(__name__)
class RamanParams():
def __init__(self, params):
self._flag_raman = params['flag_raman']
self._space_resolution = params['space_resolution']
self._tolerance = params['tolerance']
@property
def flag_raman(self):
return self._flag_raman
@property
def space_resolution(self):
return self._space_resolution
@property
def tolerance(self):
return self._tolerance
class NLIParams():
def __init__(self, params):
self._nli_method_name = params['nli_method_name']
self._wdm_grid_size = params['wdm_grid_size']
self._dispersion_tolerance = params['dispersion_tolerance']
self._phase_shift_tollerance = params['phase_shift_tollerance']
self._f_cut_resolution = None
self._f_pump_resolution = None
@property
def nli_method_name(self):
return self._nli_method_name
@property
def wdm_grid_size(self):
return self._wdm_grid_size
@property
def dispersion_tolerance(self):
return self._dispersion_tolerance
@property
def phase_shift_tollerance(self):
return self._phase_shift_tollerance
@property
def f_cut_resolution(self):
return self._f_cut_resolution
@f_cut_resolution.setter
def f_cut_resolution(self, f_cut_resolution):
self._f_cut_resolution = f_cut_resolution
@property
def f_pump_resolution(self):
return self._f_pump_resolution
@f_pump_resolution.setter
def f_pump_resolution(self, f_pump_resolution):
self._f_pump_resolution = f_pump_resolution
class SimParams():
def __init__(self, params):
self._raman_computed_channels = params['raman_computed_channels']
self._raman_params = RamanParams(params=params['raman_parameters'])
self._nli_params = NLIParams(params=params['nli_parameters'])
@property
def raman_computed_channels(self):
return self._raman_computed_channels
@property
def raman_params(self):
return self._raman_params
@property
def nli_params(self):
return self._nli_params
class FiberParams():
def __init__(self, fiber):
self._loss_coef = 2 * fiber.dbkm_2_lin()[1]
self._length = fiber.length
self._gamma = fiber.gamma
self._beta2 = fiber.beta2()
self._beta3 = fiber.beta3 if hasattr(fiber, 'beta3') else 0
self._f_ref_beta = fiber.f_ref_beta if hasattr(fiber, 'f_ref_beta') else 0
self._raman_efficiency = fiber.params.raman_efficiency
self._temperature = fiber.operational['temperature']
@property
def loss_coef(self):
return self._loss_coef
@property
def length(self):
return self._length
@property
def gamma(self):
return self._gamma
@property
def beta2(self):
return self._beta2
@property
def beta3(self):
return self._beta3
@property
def f_ref_beta(self):
return self._f_ref_beta
@property
def raman_efficiency(self):
return self._raman_efficiency
@property
def temperature(self):
return self._temperature
def alpha0(self, f_ref=193.5e12):
""" It returns the zero element of the series expansion of attenuation coefficient alpha(f) in the
reference frequency f_ref
:param f_ref: reference frequency of series expansion [Hz]
:return: alpha0: power attenuation coefficient in f_ref [Neper/m]
"""
if not hasattr(self.loss_coef, 'alpha_power'):
alpha0 = self.loss_coef
else:
alpha_interp = interp1d(self.loss_coef['frequency'],
self.loss_coef['alpha_power'])
alpha0 = alpha_interp(f_ref)
return alpha0
pump = namedtuple('RamanPump', 'power frequency propagation_direction')
def propagate_raman_fiber(fiber, *carriers):
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
raman_params = sim_params.raman_params
nli_params = sim_params.nli_params
# apply input attenuation to carriers
attenuation_in = db2lin(fiber.params.con_in + fiber.params.att_in)
chan = []
for carrier in carriers:
pwr = carrier.power
@@ -166,36 +31,32 @@ def propagate_raman_fiber(fiber, *carriers):
carrier = carrier._replace(power=pwr)
chan.append(carrier)
carriers = tuple(f for f in chan)
# evaluate fiber attenuation involving also SRS if required by sim_params
raman_solver = fiber.raman_solver
raman_solver.carriers = carriers
raman_solver.raman_pumps = fiber.raman_pumps
stimulated_raman_scattering = raman_solver.stimulated_raman_scattering
fiber_attenuation = (stimulated_raman_scattering.rho[:, -1])**-2
if not raman_params.flag_raman:
fiber_attenuation = tuple(fiber.params.lin_attenuation for _ in carriers)
# evaluate Raman ASE noise if required by sim_params and if raman pumps are present
if raman_params.flag_raman and fiber.raman_pumps:
raman_ase = raman_solver.spontaneous_raman_scattering.power[:, -1]
else:
raman_ase = tuple(0 for _ in carriers)
# evaluate nli and propagate in fiber
attenuation_out = db2lin(fiber.params.con_out)
nli_solver = fiber.nli_solver
nli_solver.stimulated_raman_scattering = stimulated_raman_scattering
nli_frequencies = []
computed_nli = []
for carrier in (c for c in carriers if c.channel_number in sim_params.nli_params.computed_channels):
resolution_param = frequency_resolution(carrier, carriers, sim_params, fiber)
f_cut_resolution, f_pump_resolution, _, _ = resolution_param
nli_params.f_cut_resolution = f_cut_resolution
nli_params.f_pump_resolution = f_pump_resolution
@@ -206,13 +67,14 @@ def propagate_raman_fiber(fiber, *carriers):
for carrier, attenuation, rmn_ase in zip(carriers, fiber_attenuation, raman_ase):
carrier_nli = np.interp(carrier.frequency, nli_frequencies, computed_nli)
pwr = carrier.power
pwr = pwr._replace(signal=pwr.signal / attenuation / attenuation_out,
nli=(pwr.nli + carrier_nli) / attenuation / attenuation_out,
ase=((pwr.ase / attenuation) + rmn_ase) / attenuation_out)
new_carriers.append(carrier._replace(power=pwr))
return new_carriers
def frequency_resolution(carrier, carriers, sim_params, fiber):
def _get_freq_res_k_phi(delta_count, grid_size, alpha0, delta_z, beta2, k_tol, phi_tol):
res_phi = _get_freq_res_phase_rotation(delta_count, grid_size, delta_z, beta2, phi_tol)
res_k = _get_freq_res_dispersion_attenuation(delta_count, grid_size, alpha0, beta2, k_tol)
@@ -228,10 +90,10 @@ def frequency_resolution(carrier, carriers, sim_params, fiber_params):
grid_size = sim_params.nli_params.wdm_grid_size
delta_z = sim_params.raman_params.space_resolution
alpha0 = fiber.alpha0()
beta2 = fiber.params.beta2
k_tol = sim_params.nli_params.dispersion_tolerance
phi_tol = sim_params.nli_params.phase_shift_tolerance
f_pump_resolution, method_f_pump, res_dict_pump = \
_get_freq_res_k_phi(0, grid_size, alpha0, delta_z, beta2, k_tol, phi_tol)
f_cut_resolution = {}
@@ -247,6 +109,7 @@ def frequency_resolution(carrier, carriers, sim_params, fiber_params):
res_dict_cut[delta_number] = res_dict
return [f_cut_resolution, f_pump_resolution, (method_f_cut, method_f_pump), (res_dict_cut, res_dict_pump)]
def raised_cosine_comb(f, *carriers):
""" Returns an array storing the PSD of a WDM comb of raised cosine shaped
channels at the input frequencies defined in array f
@@ -270,30 +133,59 @@ def raised_cosine_comb(f, *carriers):
np.where(tf > 0, 1., 0.) * np.where(np.abs(ff) <= stopband, 1., 0.)) + psd
return psd
class Simulation:
_shared_dict = {}
def __init__(self):
if type(self) == Simulation:
raise NotImplementedError('Simulation cannot be instantiated')
@classmethod
def set_params(cls, sim_params):
cls._shared_dict['sim_params'] = sim_params
@classmethod
def get_simulation(cls):
self = cls.__new__(cls)
return self
@property
def sim_params(self):
return self._shared_dict['sim_params']
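# Usage sketch: Simulation shares one state dict across all instances (a Borg-style
# singleton), so parameters set once are visible to every later get_simulation() call
# (SimParams is assumed to come from gnpy.core.parameters, and sim_params_dict to be
# a loaded configuration dict):
#
#   Simulation.set_params(SimParams(**sim_params_dict))
#   sim_params = Simulation.get_simulation().sim_params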
class SpontaneousRamanScattering:
def __init__(self, frequency, z, power):
self.frequency = frequency
self.z = z
self.power = power
class StimulatedRamanScattering:
def __init__(self, frequency, z, rho, power):
self.frequency = frequency
self.z = z
self.rho = rho
self.power = power
class RamanSolver:
def __init__(self, fiber=None):
""" Initialize the Raman solver object.
:param fiber: instance of elements.py/Fiber.
Carriers and Raman pumps are assigned afterwards through the corresponding property setters.
"""
self.fiber_params = fiber_params
self.raman_params = raman_params
self._fiber = fiber
self._carriers = None
self._raman_pumps = None
self._stimulated_raman_scattering = None
self._spontaneous_raman_scattering = None
@property
def fiber_params(self):
return self._fiber_params
@fiber_params.setter
def fiber_params(self, fiber_params):
self._stimulated_raman_scattering = None
self._fiber_params = fiber_params
def fiber(self):
return self._fiber
@property
def carriers(self):
@@ -301,11 +193,8 @@ class RamanSolver:
@carriers.setter
def carriers(self, carriers):
"""
:param carriers: tuple of namedtuples containing information about carriers
:return:
"""
self._carriers = carriers
self._spontaneous_raman_scattering = None
self._stimulated_raman_scattering = None
@property
@@ -318,62 +207,43 @@ class RamanSolver:
self._stimulated_raman_scattering = None
@property
def raman_params(self):
return self._raman_params
@raman_params.setter
def raman_params(self, raman_params):
"""
:param raman_params: namedtuple containing the solver parameters (optional).
:return:
"""
self._raman_params = raman_params
self._stimulated_raman_scattering = None
self._spontaneous_raman_scattering = None
def stimulated_raman_scattering(self):
if self._stimulated_raman_scattering is None:
self.calculate_stimulated_raman_scattering(self.carriers, self.raman_pumps)
return self._stimulated_raman_scattering
@property
def spontaneous_raman_scattering(self):
if self._spontaneous_raman_scattering is None:
# SET STUFF
loss_coef = self.fiber_params.loss_coef
raman_efficiency = self.fiber_params.raman_efficiency
temperature = self.fiber_params.temperature
carriers = self.carriers
raman_pumps = self.raman_pumps
logger.debug('Start computing fiber Spontaneous Raman Scattering')
power_spectrum, freq_array, prop_direct, bn_array = self._compute_power_spectrum(carriers, raman_pumps)
if not hasattr(loss_coef, 'alpha_power'):
alphap_fiber = loss_coef * np.ones(freq_array.shape)
else:
interp_alphap = interp1d(loss_coef['frequency'], loss_coef['alpha_power'])
alphap_fiber = interp_alphap(freq_array)
freq_diff = abs(freq_array - np.reshape(freq_array, (len(freq_array), 1)))
interp_cr = interp1d(raman_efficiency['frequency_offset'], raman_efficiency['cr'])
cr = interp_cr(freq_diff)
# z propagation axis
z_array = self._stimulated_raman_scattering.z
ase_bc = np.zeros(freq_array.shape)
# calculate ase power
spontaneous_raman_scattering = self._int_spontaneous_raman(z_array, self._stimulated_raman_scattering.power,
alphap_fiber, freq_array, cr, freq_diff, ase_bc,
bn_array, temperature)
setattr(spontaneous_raman_scattering, 'frequency', freq_array)
setattr(spontaneous_raman_scattering, 'z', z_array)
setattr(spontaneous_raman_scattering, 'power', spontaneous_raman_scattering.x)
delattr(spontaneous_raman_scattering, 'x')
logger.debug(spontaneous_raman_scattering.message)
self._spontaneous_raman_scattering = spontaneous_raman_scattering
self.calculate_spontaneous_raman_scattering(self.carriers, self.raman_pumps)
return self._spontaneous_raman_scattering
def calculate_spontaneous_raman_scattering(self, carriers, raman_pumps):
raman_efficiency = self.fiber.params.raman_efficiency
temperature = self.fiber.operational['temperature']
logger.debug('Start computing fiber Spontaneous Raman Scattering')
power_spectrum, freq_array, prop_direct, bn_array = self._compute_power_spectrum(carriers, raman_pumps)
alphap_fiber = self.fiber.alpha(freq_array)
freq_diff = abs(freq_array - np.reshape(freq_array, (len(freq_array), 1)))
interp_cr = interp1d(raman_efficiency['frequency_offset'], raman_efficiency['cr'])
cr = interp_cr(freq_diff)
# z propagation axis
z_array = self.stimulated_raman_scattering.z
ase_bc = np.zeros(freq_array.shape)
# calculate ase power
int_spontaneous_raman = self._int_spontaneous_raman(z_array, self._stimulated_raman_scattering.power,
alphap_fiber, freq_array, cr, freq_diff, ase_bc,
bn_array, temperature)
spontaneous_raman_scattering = SpontaneousRamanScattering(freq_array, z_array, int_spontaneous_raman.x)
logger.debug("Spontaneous Raman Scattering evaluated successfully")
self._spontaneous_raman_scattering = spontaneous_raman_scattering
@staticmethod
def _compute_power_spectrum(carriers, raman_pumps=None):
"""
@@ -412,10 +282,14 @@ class RamanSolver:
return pow_array, f_array, propagation_direction, noise_bandwidth_array
def _int_spontaneous_raman(self, z_array, raman_matrix, alphap_fiber, freq_array, cr_raman_matrix, freq_diff, ase_bc, bn_array, temperature):
def _int_spontaneous_raman(self, z_array, raman_matrix, alphap_fiber, freq_array,
cr_raman_matrix, freq_diff, ase_bc, bn_array, temperature):
spontaneous_raman_scattering = OptimizeResult()
dx = self.raman_params.space_resolution
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
dx = sim_params.raman_params.space_resolution
h = ph.value('Planck constant')
kb = ph.value('Boltzmann constant')
@@ -425,85 +299,73 @@ class RamanSolver:
for f_ind, f_ase in enumerate(freq_array):
cr_raman = cr_raman_matrix[f_ind, :]
vibrational_loss = f_ase / freq_array[:f_ind]
eta = 1/(np.exp((h*freq_diff[f_ind, f_ind+1:])/(kb*temperature)) - 1)
eta = 1 / (np.exp((h * freq_diff[f_ind, f_ind + 1:]) / (kb * temperature)) - 1)
int_fiber_loss = -alphap_fiber[f_ind] * z_array
int_raman_loss = np.sum((cr_raman[:f_ind] * vibrational_loss * int_pump[:f_ind, :].transpose()).transpose(), axis=0)
int_raman_loss = np.sum((cr_raman[:f_ind] * vibrational_loss * int_pump[:f_ind, :].transpose()).transpose(),
axis=0)
int_raman_gain = np.sum((cr_raman[f_ind + 1:] * int_pump[f_ind + 1:, :].transpose()).transpose(), axis=0)
int_gain_loss = int_fiber_loss + int_raman_gain + int_raman_loss
new_ase = np.sum((cr_raman[f_ind+1:] * (1 + eta) * raman_matrix[f_ind+1:, :].transpose()).transpose() * h * f_ase * bn_array[f_ind], axis=0)
new_ase = np.sum((cr_raman[f_ind + 1:] * (1 + eta) * raman_matrix[f_ind + 1:, :].transpose()).transpose()
* h * f_ase * bn_array[f_ind], axis=0)
bc_evolution = ase_bc[f_ind] * np.exp(int_gain_loss)
ase_evolution = np.exp(int_gain_loss) * cumtrapz(new_ase*np.exp(-int_gain_loss), z_array, dx=dx, initial=0)
ase_evolution = np.exp(int_gain_loss) * cumtrapz(new_ase *
np.exp(-int_gain_loss), z_array, dx=dx, initial=0)
power_ase[f_ind, :] = bc_evolution + ase_evolution
spontaneous_raman_scattering.x = 2 * power_ase
spontaneous_raman_scattering.success = True
spontaneous_raman_scattering.message = "Spontaneous Raman Scattering evaluated successfully"
return spontaneous_raman_scattering
def stimulated_raman_scattering(self, carriers, raman_pumps=None):
""" Returns stimulated Raman scattering solution including
def calculate_stimulated_raman_scattering(self, carriers, raman_pumps):
""" Returns stimulated Raman scattering solution including
fiber gain/loss profile.
:return: self._stimulated_raman_scattering: the SRS problem solution.
scipy.interpolate.PPoly instance
:return: None
"""
# fiber parameters
fiber_length = self.fiber.params.length
raman_efficiency = self.fiber.params.raman_efficiency
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
if self._stimulated_raman_scattering is None:
# fiber parameters
fiber_length = self.fiber_params.length
loss_coef = self.fiber_params.loss_coef
if self.raman_params.flag_raman:
raman_efficiency = self.fiber_params.raman_efficiency
else:
raman_efficiency = self.fiber_params.raman_efficiency
raman_efficiency['cr'] = np.array(raman_efficiency['cr']) * 0
# raman solver parameters
z_resolution = self.raman_params.space_resolution
tolerance = self.raman_params.tolerance
if not sim_params.raman_params.flag_raman:
raman_efficiency['cr'] = np.zeros(len(raman_efficiency['cr']))
# raman solver parameters
z_resolution = sim_params.raman_params.space_resolution
tolerance = sim_params.raman_params.tolerance
logger.debug('Start computing fiber Stimulated Raman Scattering')
logger.debug('Start computing fiber Stimulated Raman Scattering')
power_spectrum, freq_array, prop_direct, _ = self._compute_power_spectrum(carriers, raman_pumps)
power_spectrum, freq_array, prop_direct, _ = self._compute_power_spectrum(carriers, raman_pumps)
if not hasattr(loss_coef, 'alpha_power'):
alphap_fiber = loss_coef * np.ones(freq_array.shape)
else:
interp_alphap = interp1d(loss_coef['frequency'], loss_coef['alpha_power'])
alphap_fiber = interp_alphap(freq_array)
alphap_fiber = self.fiber.alpha(freq_array)
freq_diff = abs(freq_array - np.reshape(freq_array, (len(freq_array), 1)))
interp_cr = interp1d(raman_efficiency['frequency_offset'], raman_efficiency['cr'])
cr = interp_cr(freq_diff)
freq_diff = abs(freq_array - np.reshape(freq_array, (len(freq_array), 1)))
interp_cr = interp1d(raman_efficiency['frequency_offset'], raman_efficiency['cr'])
cr = interp_cr(freq_diff)
# z propagation axis
z = np.arange(0, fiber_length+1, z_resolution)
# z propagation axis
z = np.arange(0, fiber_length + 1, z_resolution)
ode_function = lambda z, p: self._ode_stimulated_raman(z, p, alphap_fiber, freq_array, cr, prop_direct)
boundary_residual = lambda ya, yb: self._residuals_stimulated_raman(ya, yb, power_spectrum, prop_direct)
initial_guess_conditions = self._initial_guess_stimulated_raman(z, power_spectrum, alphap_fiber, prop_direct)
def ode_function(z, p):
return self._ode_stimulated_raman(z, p, alphap_fiber, freq_array, cr, prop_direct)
# ODE SOLVER
stimulated_raman_scattering = solve_bvp(ode_function, boundary_residual, z, initial_guess_conditions, tol=tolerance)
def boundary_residual(ya, yb):
return self._residuals_stimulated_raman(ya, yb, power_spectrum, prop_direct)
rho = (stimulated_raman_scattering.y.transpose() / power_spectrum).transpose()
rho = np.sqrt(rho) # From power attenuation to field attenuation
setattr(stimulated_raman_scattering, 'frequency', freq_array)
setattr(stimulated_raman_scattering, 'z', stimulated_raman_scattering.x)
setattr(stimulated_raman_scattering, 'rho', rho)
setattr(stimulated_raman_scattering, 'power', stimulated_raman_scattering.y)
delattr(stimulated_raman_scattering, 'x')
delattr(stimulated_raman_scattering, 'y')
initial_guess_conditions = self._initial_guess_stimulated_raman(z, power_spectrum, alphap_fiber, prop_direct)
self.carriers = carriers
self.raman_pumps = raman_pumps
self._stimulated_raman_scattering = stimulated_raman_scattering
# ODE SOLVER
bvp_solution = solve_bvp(ode_function, boundary_residual, z, initial_guess_conditions, tol=tolerance)
return self._stimulated_raman_scattering
rho = (bvp_solution.y.transpose() / power_spectrum).transpose()
rho = np.sqrt(rho) # From power attenuation to field attenuation
stimulated_raman_scattering = StimulatedRamanScattering(freq_array, bvp_solution.x, rho, bvp_solution.y)
self._stimulated_raman_scattering = stimulated_raman_scattering
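A minimal driving sketch for the solver, where `fiber`, `carriers` and `raman_pumps` are placeholders for a Fiber element, the carrier tuple and the pump tuple described in the setters:

    solver = RamanSolver(fiber)
    solver.carriers = carriers                 # the setters invalidate any cached solution
    solver.raman_pumps = raman_pumps
    srs = solver.stimulated_raman_scattering   # first access triggers calculate_stimulated_raman_scattering()
    print(srs.rho.shape, srs.z[-1])            # field attenuation profile and end-of-fiber coordinate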
def _residuals_stimulated_raman(self, ya, yb, power_spectrum, prop_direct):
@@ -520,11 +382,14 @@ class RamanSolver:
def _initial_guess_stimulated_raman(self, z, power_spectrum, alphap_fiber, prop_direct):
""" Computes the initial guess knowing the boundary conditions
:param z: spatial axis [m]. numpy array
:param power_spectrum: power in each frequency slice [W]. Frequency axis is defined by freq_array. numpy array
:param alphap_fiber: frequency dependent fiber attenuation of signal power [1/m]. Frequency defined by freq_array. numpy array
:param power_spectrum: power in each frequency slice [W].
Frequency axis is defined by freq_array. numpy array
:param alphap_fiber: frequency dependent fiber attenuation of signal power [1/m].
Frequency defined by freq_array. numpy array
:param prop_direct: indicates the propagation direction of each power slice in power_spectrum:
+1 for forward propagation and -1 for backward propagation. Frequency defined by freq_array. numpy array
:return: power_guess: guess on the initial conditions [W]. The first ndarray index identifies the frequency slice,
:return: power_guess: guess on the initial conditions [W].
The first ndarray index identifies the frequency slice,
the second ndarray index identifies the step in z. ndarray
"""
@@ -538,14 +403,19 @@ class RamanSolver:
return power_guess
def _ode_stimulated_raman(self, z, power_spectrum, alphap_fiber, freq_array, cr_raman_matrix, prop_direct):
""" Aim of ode_raman is to implement the set of ordinary differential equations (ODEs) describing the Raman effect.
""" Aim of ode_raman is to implement the set of ordinary differential equations (ODEs)
describing the Raman effect.
:param z: spatial axis (unused).
:param power_spectrum: power in each frequency slice [W]. Frequency axis is defined by freq_array. numpy array. Size n
:param alphap_fiber: frequency dependent fiber attenuation of signal power [1/m]. Frequency defined by freq_array. numpy array. Size n
:param power_spectrum: power in each frequency slice [W].
Frequency axis is defined by freq_array. numpy array. Size n
:param alphap_fiber: frequency dependent fiber attenuation of signal power [1/m].
Frequency defined by freq_array. numpy array. Size n
:param freq_array: reference frequency axis [Hz]. numpy array. Size n
:param cr_raman: Cr(f) Raman gain efficiency variation in frequency [1/W/m]. Frequency defined by freq_array. numpy ndarray. Size nxn
:param cr_raman: Cr(f) Raman gain efficiency variation in frequency [1/W/m].
Frequency defined by freq_array. numpy ndarray. Size nxn
:param prop_direct: indicates the propagation direction of each power slice in power_spectrum:
+1 for forward propagation and -1 for backward propagation. Frequency defined by freq_array. numpy array. Size n
+1 for forward propagation and -1 for backward propagation.
Frequency defined by freq_array. numpy array. Size n
:return: dP/dz: the power variation in dz [W/m]. numpy array. Size n
"""
@@ -555,7 +425,7 @@ class RamanSolver:
vibrational_loss = freq_array[f_ind] / freq_array[:f_ind]
for z_ind, power_sample in enumerate(power):
raman_gain = np.sum(cr_raman[f_ind+1:] * power_spectrum[f_ind+1:, z_ind])
raman_gain = np.sum(cr_raman[f_ind + 1:] * power_spectrum[f_ind + 1:, z_ind])
raman_loss = np.sum(vibrational_loss * cr_raman[:f_ind] * power_spectrum[:f_ind, z_ind])
dpdz_element = prop_direct[f_ind] * (-alphap_fiber[f_ind] + raman_gain - raman_loss) * power_sample
@@ -563,28 +433,25 @@ class RamanSolver:
return np.vstack(dpdz)
class NliSolver:
""" This class implements the NLI models.
Model and method can be specified in `self.nli_params.method`.
Model and method can be specified in `sim_params.nli_params.nli_method_name`.
List of implemented methods:
'gn_model_analytic': brute force triple integral solution
'ggn_spectrally_separated_xpm_spm': XPM plus SPM
"""
def __init__(self, nli_params=None, fiber_params=None):
""" Initialize the fiber object with its physical parameters
def __init__(self, fiber=None):
""" Initialize the Nli solver object.
:param fiber: instance of elements.py/Fiber.
"""
self.fiber_params = fiber_params
self.nli_params = nli_params
self.stimulated_raman_scattering = None
self._fiber = fiber
self._stimulated_raman_scattering = None
@property
def fiber_params(self):
return self._fiber_params
@fiber_params.setter
def fiber_params(self, fiber_params):
self._fiber_params = fiber_params
def fiber(self):
return self._fiber
@property
def stimulated_raman_scattering(self):
@@ -594,28 +461,19 @@ class NliSolver:
def stimulated_raman_scattering(self, stimulated_raman_scattering):
self._stimulated_raman_scattering = stimulated_raman_scattering
@property
def nli_params(self):
return self._nli_params
@nli_params.setter
def nli_params(self, nli_params):
"""
:param model_params: namedtuple containing the parameters used to compute the NLI.
"""
self._nli_params = nli_params
def compute_nli(self, carrier, *carriers):
""" Compute NLI power generated by the WDM comb `*carriers` on the channel under test `carrier`
at the end of the fiber span.
"""
if 'gn_model_analytic' == self.nli_params.nli_method_name.lower():
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
if 'gn_model_analytic' == sim_params.nli_params.nli_method_name.lower():
carrier_nli = self._gn_analytic(carrier, *carriers)
elif 'ggn_spectrally_separated' in self.nli_params.nli_method_name.lower():
elif 'ggn_spectrally_separated' in sim_params.nli_params.nli_method_name.lower():
eta_matrix = self._compute_eta_matrix(carrier, *carriers)
carrier_nli = self._carrier_nli_from_eta_matrix(eta_matrix, carrier, *carriers)
else:
raise ValueError(f'Method {self.nli_params.method_nli} not implemented.')
raise ValueError(f'Method {sim_params.nli_params.nli_method_name} not implemented.')
return carrier_nli
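A sketch of the calling pattern, reusing the Raman solution from above (variable names are placeholders; Simulation.set_params() must have been called first):

    nli_solver = NliSolver(fiber)
    nli_solver.stimulated_raman_scattering = srs          # the GGN methods need the rho(z) profile
    cut = carriers[0]                                     # channel under test
    carrier_nli = nli_solver.compute_nli(cut, *carriers)  # NLI power in W, method per sim_params.nli_params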
@@ -624,14 +482,16 @@ class NliSolver:
carrier_nli = 0
for pump_carrier_1 in carriers:
for pump_carrier_2 in carriers:
carrier_nli += eta_matrix[pump_carrier_1.channel_number-1, pump_carrier_2.channel_number-1] * \
pump_carrier_1.power.signal * pump_carrier_2.power.signal
carrier_nli += eta_matrix[pump_carrier_1.channel_number - 1, pump_carrier_2.channel_number - 1] * \
pump_carrier_1.power.signal * pump_carrier_2.power.signal
carrier_nli *= carrier.power.signal
return carrier_nli
def _compute_eta_matrix(self, carrier_cut, *carriers):
cut_index = carrier_cut.channel_number - 1
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
# Matrix initialization
matrix_size = max(carriers, key=lambda x: getattr(x, 'channel_number')).channel_number
eta_matrix = np.zeros(shape=(matrix_size, matrix_size))
@@ -639,10 +499,10 @@ class NliSolver:
# SPM
logger.debug(f'Start computing SPM on channel #{carrier_cut.channel_number}')
# SPM GGN
if 'ggn' in self.nli_params.nli_method_name.lower():
if 'ggn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._generalized_spectrally_separated_spm(carrier_cut)
# SPM GN
elif 'gn' in self.nli_params.nli_method_name.lower():
elif 'gn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._gn_analytic(carrier_cut, *[carrier_cut])
eta_matrix[cut_index, cut_index] = partial_nli / (carrier_cut.power.signal**3)
@@ -653,13 +513,13 @@ class NliSolver:
logger.debug(f'Start computing XPM on channel #{carrier_cut.channel_number} '
f'from channel #{pump_carrier.channel_number}')
# XPM GGN
if 'ggn' in self.nli_params.nli_method_name.lower():
if 'ggn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._generalized_spectrally_separated_xpm(carrier_cut, pump_carrier)
# XPM GGN
elif 'gn' in self.nli_params.nli_method_name.lower():
elif 'gn' in sim_params.nli_params.nli_method_name.lower():
partial_nli = self._gn_analytic(carrier_cut, *[pump_carrier])
eta_matrix[pump_index, pump_index] = partial_nli /\
(carrier_cut.power.signal * pump_carrier.power.signal**2)
(carrier_cut.power.signal * pump_carrier.power.signal**2)
return eta_matrix
# Methods for computing GN-model
@@ -670,48 +530,52 @@ class NliSolver:
:param carriers: the full WDM comb
:return: carrier_nli: the amount of nonlinear interference in W on the carrier under analysis
"""
alpha = self.fiber_params.alpha0() / 2
beta2 = self.fiber_params.beta2
gamma = self.fiber_params.gamma
length = self.fiber_params.length
effective_length = (1 - np.exp(-2 * alpha * length)) / (2 * alpha)
asymptotic_length = 1 / (2 * alpha)
beta2 = self.fiber.params.beta2
gamma = self.fiber.params.gamma
effective_length = self.fiber.params.effective_length
asymptotic_length = self.fiber.params.asymptotic_length
g_nli = 0
for interfering_carrier in carriers:
g_interfering = interfering_carrier.power.signal / interfering_carrier.baud_rate
g_signal = carrier.power.signal / carrier.baud_rate
g_nli += g_interfering**2 * g_signal \
* _psi(carrier, interfering_carrier, beta2=self.fiber_params.beta2, asymptotic_length=1/self.fiber_params.alpha0())
g_nli *= (16.0 / 27.0) * (gamma * effective_length)**2 /\
* _psi(carrier, interfering_carrier, beta2=beta2, asymptotic_length=asymptotic_length)
g_nli *= (16.0 / 27.0) * (gamma * effective_length) ** 2 /\
(2 * np.pi * abs(beta2) * asymptotic_length)
carrier_nli = carrier.baud_rate * g_nli
return carrier_nli
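Written out, the closed form implemented above is (with $G_x = P_x / R_x$ the flat per-channel PSDs, $L_{eff}$ and $L_a$ the effective and asymptotic lengths):

    G_{NLI}(f_c) = \frac{16}{27}\,\frac{(\gamma L_{eff})^2}{2\pi\,|\beta_2|\,L_a}\,\sum_i G_i^2\, G_c\, \psi_{c,i},
    \qquad P_{NLI} = R_c\, G_{NLI}(f_c)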
# Methods for computing the GGN-model
def _generalized_spectrally_separated_spm(self, carrier):
f_cut_resolution = self.nli_params.f_cut_resolution['delta_0']
gamma = self.fiber.params.gamma
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
f_cut_resolution = sim_params.nli_params.f_cut_resolution['delta_0']
f_eval = carrier.frequency
g_cut = (carrier.power.signal / carrier.baud_rate)
spm_nli = carrier.baud_rate * (16.0 / 27.0) * self.fiber_params.gamma**2 * g_cut**3 * \
self._generalized_psi(carrier, carrier, f_eval, f_cut_resolution, f_cut_resolution)
spm_nli = carrier.baud_rate * (16.0 / 27.0) * gamma ** 2 * g_cut ** 3 * \
self._generalized_psi(carrier, carrier, f_eval, f_cut_resolution, f_cut_resolution)
return spm_nli
def _generalized_spectrally_separated_xpm(self, carrier_cut, pump_carrier):
gamma = self.fiber.params.gamma
simulation = Simulation.get_simulation()
sim_params = simulation.sim_params
delta_index = pump_carrier.channel_number - carrier_cut.channel_number
f_cut_resolution = self.nli_params.f_cut_resolution[f'delta_{delta_index}']
f_pump_resolution = self.nli_params.f_pump_resolution
f_cut_resolution = sim_params.nli_params.f_cut_resolution[f'delta_{delta_index}']
f_pump_resolution = sim_params.nli_params.f_pump_resolution
f_eval = carrier_cut.frequency
g_pump = (pump_carrier.power.signal / pump_carrier.baud_rate)
g_cut = (carrier_cut.power.signal / carrier_cut.baud_rate)
frequency_offset_threshold = self._frequency_offset_threshold(pump_carrier.baud_rate)
if abs(carrier_cut.frequency - pump_carrier.frequency) <= frequency_offset_threshold:
xpm_nli = carrier_cut.baud_rate * (16.0 / 27.0) * self.fiber_params.gamma**2 * g_pump**2 * g_cut * \
2 * self._generalized_psi(carrier_cut, pump_carrier, f_eval, f_cut_resolution, f_pump_resolution)
xpm_nli = carrier_cut.baud_rate * (16.0 / 27.0) * gamma ** 2 * g_pump**2 * g_cut * \
2 * self._generalized_psi(carrier_cut, pump_carrier, f_eval, f_cut_resolution, f_pump_resolution)
else:
xpm_nli = carrier_cut.baud_rate * (16.0 / 27.0) * self.fiber_params.gamma**2 * g_pump**2 * g_cut * \
2 * self._fast_generalized_psi(carrier_cut, pump_carrier, f_eval, f_cut_resolution)
xpm_nli = carrier_cut.baud_rate * (16.0 / 27.0) * gamma ** 2 * g_pump**2 * g_cut * \
2 * self._fast_generalized_psi(carrier_cut, pump_carrier, f_eval, f_cut_resolution)
return xpm_nli
def _fast_generalized_psi(self, carrier_cut, pump_carrier, f_eval, f_cut_resolution):
@@ -719,15 +583,15 @@ class NliSolver:
:return: generalized_psi
"""
# Fiber parameters
alpha0 = self.fiber_params.alpha0(f_eval)
beta2 = self.fiber_params.beta2
beta3 = self.fiber_params.beta3
f_ref_beta = self.fiber_params.f_ref_beta
alpha0 = self.fiber.alpha0(f_eval)
beta2 = self.fiber.params.beta2
beta3 = self.fiber.params.beta3
f_ref_beta = self.fiber.params.ref_frequency
z = self.stimulated_raman_scattering.z
frequency_rho = self.stimulated_raman_scattering.frequency
rho_norm = self.stimulated_raman_scattering.rho * np.exp(np.abs(alpha0) * z / 2)
if len(frequency_rho) == 1:
rho_function = lambda f: rho_norm[0, :]
def rho_function(f): return rho_norm[0, :]
else:
rho_function = interp1d(frequency_rho, rho_norm, axis=0, fill_value='extrapolate')
rho_norm_pump = rho_function(pump_carrier.frequency)
@@ -741,7 +605,7 @@ class NliSolver:
integrand_f1 = np.zeros(len(f1_array))
for f1_index, f1 in enumerate(f1_array):
delta_beta = 4 * np.pi**2 * (f1 - f_eval) * (f2_array - f_eval) * \
(beta2 + np.pi * beta3 * (f1 + f2_array - 2 * f_ref_beta))
(beta2 + np.pi * beta3 * (f1 + f2_array - 2 * f_ref_beta))
integrand_f2 = self._generalized_rho_nli(delta_beta, rho_norm_pump, z, alpha0)
integrand_f1[f1_index] = 2 * np.trapz(integrand_f2, f2_array) # 2x since integrand_f2 is symmetric in f2
generalized_psi = 0.5 * sum(integrand_f1) * pump_carrier.baud_rate
@@ -752,15 +616,15 @@ class NliSolver:
:return: generalized_psi
"""
# Fiber parameters
alpha0 = self.fiber_params.alpha0(f_eval)
beta2 = self.fiber_params.beta2
beta3 = self.fiber_params.beta3
f_ref_beta = self.fiber_params.f_ref_beta
alpha0 = self.fiber.alpha0(f_eval)
beta2 = self.fiber.params.beta2
beta3 = self.fiber.params.beta3
f_ref_beta = self.fiber.params.ref_frequency
z = self.stimulated_raman_scattering.z
frequency_rho = self.stimulated_raman_scattering.frequency
rho_norm = self.stimulated_raman_scattering.rho * np.exp(np.abs(alpha0) * z / 2)
if len(frequency_rho) == 1:
rho_function = lambda f: rho_norm[0, :]
def rho_function(f): return rho_norm[0, :]
else:
rho_function = interp1d(frequency_rho, rho_norm, axis=0, fill_value='extrapolate')
rho_norm_pump = rho_function(pump_carrier.frequency)
@@ -781,7 +645,7 @@ class NliSolver:
ggg = psd1_sample * psd2 * psd3
delta_beta = 4 * np.pi**2 * (f1 - f_eval) * (f2_array - f_eval) * \
(beta2 + np.pi * beta3 * (f1 + f2_array - 2 * f_ref_beta))
(beta2 + np.pi * beta3 * (f1 + f2_array - 2 * f_ref_beta))
integrand_f2 = ggg * self._generalized_rho_nli(delta_beta, rho_norm_pump, z, alpha0)
integrand_f1[f1_index] = np.trapz(integrand_f2, f2_array)
@@ -803,18 +667,66 @@ class NliSolver:
beta2_ref = 21.3e-27
delta_f_ref = 50e9
rs_ref = 32e9
freq_offset_th = ((k_ref * delta_f_ref) * rs_ref * beta2_ref) / (self.fiber_params.beta2 * symbol_rate)
beta2 = abs(self.fiber.params.beta2)
freq_offset_th = ((k_ref * delta_f_ref) * rs_ref * beta2_ref) / (beta2 * symbol_rate)
return freq_offset_th
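Per the constants above, the threshold that switches between the exact and the fast $\psi$ evaluation is

    f_{th} = \frac{k_{ref}\,\Delta f_{ref}\,R_{s,ref}\,\beta_{2,ref}}{|\beta_2|\,R_s}

with $\beta_{2,ref} = 21.3\,\mathrm{ps^2/km}$, $\Delta f_{ref} = 50\,\mathrm{GHz}$, $R_{s,ref} = 32\,\mathrm{GBd}$; $k_{ref}$ is defined just above this excerpt.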
def _psi(carrier, interfering_carrier, beta2, asymptotic_length):
"""Calculates eq. 123 from `arXiv:1209.0394 <https://arxiv.org/abs/1209.0394>`__"""
if carrier.channel_number == interfering_carrier.channel_number: # SCI, SPM
if carrier.channel_number == interfering_carrier.channel_number: # SCI, SPM
psi = np.arcsinh(0.5 * np.pi**2 * asymptotic_length * abs(beta2) * carrier.baud_rate**2)
else: # XCI, XPM
else: # XCI, XPM
delta_f = carrier.frequency - interfering_carrier.frequency
psi = np.arcsinh(np.pi**2 * asymptotic_length * abs(beta2) *
carrier.baud_rate * (delta_f + 0.5 * interfering_carrier.baud_rate))
psi -= np.arcsinh(np.pi**2 * asymptotic_length * abs(beta2) *
carrier.baud_rate * (delta_f - 0.5 * interfering_carrier.baud_rate))
return psi
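In formula form ($L_a$ the asymptotic length, $R$ the baud rates, $\Delta f$ the carrier separation), the two branches above read:

    \psi_{SCI} = \operatorname{asinh}\Big(\tfrac{\pi^2}{2}\,L_a\,|\beta_2|\,R_c^2\Big)
    \psi_{XCI} = \operatorname{asinh}\Big(\pi^2 L_a |\beta_2| R_c \big(\Delta f + \tfrac{R_i}{2}\big)\Big)
               - \operatorname{asinh}\Big(\pi^2 L_a |\beta_2| R_c \big(\Delta f - \tfrac{R_i}{2}\big)\Big)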
def estimate_nf_model(type_variety, gain_min, gain_max, nf_min, nf_max):
if nf_min < -10:
raise EquipmentConfigError(f'Invalid nf_min value {nf_min!r} for amplifier {type_variety}')
if nf_max < -10:
raise EquipmentConfigError(f'Invalid nf_max value {nf_max!r} for amplifier {type_variety}')
# NF estimation model based on nf_min and nf_max
# delta_p: max power dB difference between first and second stage coils
# dB g1a: first stage gain - internal VOA attenuation
# nf1, nf2: first and second stage coils
# calculated by solving nf_{min,max} = nf1 + nf2 / g1a{min,max}
delta_p = 5
g1a_min = gain_min - (gain_max - gain_min) - delta_p
g1a_max = gain_max - delta_p
nf2 = lin2db((db2lin(nf_min) - db2lin(nf_max)) /
(1 / db2lin(g1a_max) - 1 / db2lin(g1a_min)))
nf1 = lin2db(db2lin(nf_min) - db2lin(nf2) / db2lin(g1a_max))
if nf1 < 4:
raise EquipmentConfigError(f'First coil value too low {nf1} for amplifier {type_variety}')
# Check 1 dB < delta_p < 6 dB to ensure nf_min and nf_max values make sense.
# There shouldn't be high nf differences between the two coils:
# nf2 should be nf1 + 0.3 < nf2 < nf1 + 2
# If not, recompute and check delta_p
if not nf1 + 0.3 < nf2 < nf1 + 2:
nf2 = np.clip(nf2, nf1 + 0.3, nf1 + 2)
g1a_max = lin2db(db2lin(nf2) / (db2lin(nf_min) - db2lin(nf1)))
delta_p = gain_max - g1a_max
g1a_min = gain_min - (gain_max - gain_min) - delta_p
if not 1 < delta_p < 11:
raise EquipmentConfigError(f'Computed \N{greek capital letter delta}P invalid \
\n 1st coil vs 2nd coil calculated DeltaP {delta_p:.2f} for \
\n amplifier {type_variety} is not valid: revise inputs \
\n calculated 1st coil NF = {nf1:.2f}, 2nd coil NF = {nf2:.2f}')
# Check calculated values for nf1 and nf2
calc_nf_min = lin2db(db2lin(nf1) + db2lin(nf2) / db2lin(g1a_max))
if not isclose(nf_min, calc_nf_min, abs_tol=0.01):
raise EquipmentConfigError(f'nf_min does not match calc_nf_min, {nf_min} vs {calc_nf_min} for amp {type_variety}')
calc_nf_max = lin2db(db2lin(nf1) + db2lin(nf2) / db2lin(g1a_min))
if not isclose(nf_max, calc_nf_max, abs_tol=0.01):
raise EquipmentConfigError(f'nf_max does not match calc_nf_max, {nf_max} vs {calc_nf_max} for amp {type_variety}')
return nf1, nf2, delta_p
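A usage sketch with illustrative (hypothetical) gain and NF figures; the function raises EquipmentConfigError whenever the four inputs are not mutually consistent:

    try:
        # hypothetical dual-coil EDFA: 15-21 dB gain range, 6.5-10 dB NF range
        nf1, nf2, delta_p = estimate_nf_model('std_low_gain', 15, 21, 6.5, 10)
    except EquipmentConfigError as error:
        print(error)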


@@ -1,256 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.core.service_sheet
========================
XLS parser that can be called to create a JSON request file in accordance with
Yang model for requesting path computation.
See: draft-ietf-teas-yang-path-computation-01.txt
"""
from sys import exit
try:
from xlrd import open_workbook, XL_CELL_EMPTY
except ModuleNotFoundError:
exit('Required: `pip install xlrd`')
from collections import namedtuple
from logging import getLogger, basicConfig, CRITICAL, DEBUG, INFO
from json import dumps
from pathlib import Path
from gnpy.core.equipment import load_equipment
from gnpy.core.utils import db2lin, lin2db
SERVICES_COLUMN = 12
#EQPT_LIBRARY_FILENAME = Path(__file__).parent / 'eqpt_config.json'
all_rows = lambda sheet, start=0: (sheet.row(x) for x in range(start, sheet.nrows))
logger = getLogger(__name__)
# Type for input data
class Request(namedtuple('Request', 'request_id source destination trx_type mode \
spacing power nb_channel disjoint_from nodes_list is_loose path_bandwidth')):
def __new__(cls, request_id, source, destination, trx_type, mode=None, spacing=None, power=None, nb_channel=None, disjoint_from='', nodes_list=None, is_loose='', path_bandwidth=None):
return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing, power, nb_channel, disjoint_from, nodes_list, is_loose, path_bandwidth)
# Type for output data: // from dutc
class Element:
def __eq__(self, other):
return type(self) == type(other) and self.uid == other.uid
def __hash__(self):
return hash((type(self), self.uid))
class Request_element(Element):
def __init__(self,Request,eqpt_filename):
# request_id is str
# excel has automatic number formatting that adds .0 on integer values
# the next lines recover the pure int value, assuming this .0 is unwanted
self.request_id = correct_xlrd_int_to_str_reading(Request.request_id)
self.source = Request.source
self.destination = Request.destination
# TODO: the automatic naming generated by excel parser requires that source and dest name
# be a string starting with 'trx' : this is manually added here.
self.srctpid = f'trx {Request.source}'
self.dsttpid = f'trx {Request.destination}'
# test that trx_type belongs to eqpt_config.json
# if not replace it with a default
equipment = load_equipment(eqpt_filename)
try :
if equipment['Transceiver'][Request.trx_type]:
self.trx_type = correct_xlrd_int_to_str_reading(Request.trx_type)
if Request.mode is not None :
Requestmode = correct_xlrd_int_to_str_reading(Request.mode)
if [mode for mode in equipment['Transceiver'][Request.trx_type].mode if mode['format'] == Requestmode]:
self.mode = Requestmode
else :
msg = f'Request Id: {self.request_id} - could not find tsp : \'{Request.trx_type}\' with mode: \'{Requestmode}\' in eqpt library \nComputation stopped.'
#print(msg)
logger.critical(msg)
exit(1)
else:
Requestmode = None
self.mode = Request.mode
except KeyError:
msg = f'Request Id: {self.request_id} - could not find tsp : \'{Request.trx_type}\' with mode: \'{Request.mode}\' in eqpt library \nComputation stopped.'
#print(msg)
logger.critical(msg)
exit()
# excel input are in GHz and dBm
if Request.spacing is not None:
self.spacing = Request.spacing * 1e9
else:
msg = f'Request {self.request_id} missing spacing: spacing is mandatory.\ncomputation stopped'
logger.critical(msg)
exit()
if Request.power is not None:
self.power = db2lin(Request.power) * 1e-3
else:
self.power = None
if Request.nb_channel is not None :
self.nb_channel = int(Request.nb_channel)
else:
self.nb_channel = None
value = correct_xlrd_int_to_str_reading(Request.disjoint_from)
self.disjoint_from = [n for n in value.split(' | ') if value]
self.nodes_list = []
if Request.nodes_list :
self.nodes_list = Request.nodes_list.split(' | ')
# cleaning the list of nodes to remove source and destination
# (because the remaining of the program assumes that the nodes list are nodes
# on the path and should not include source and destination)
try :
self.nodes_list.remove(self.source)
msg = f'{self.source} removed from explicit path node-list'
logger.info(msg)
except ValueError:
msg = f'{self.source} already removed from explicit path node-list'
logger.info(msg)
try :
self.nodes_list.remove(self.destination)
msg = f'{self.destination} removed from explicit path node-list'
logger.info(msg)
except ValueError:
msg = f'{self.destination} already removed from explicit path node-list'
logger.info(msg)
# the excel parser applies the same hop-type to all nodes in the route nodes_list.
# user can change this per node in the generated json
self.loose = 'loose'
if Request.is_loose == 'no' :
self.loose = 'strict'
self.path_bandwidth = None
if Request.path_bandwidth is not None:
self.path_bandwidth = Request.path_bandwidth * 1e9
else:
self.path_bandwidth = 0
uid = property(lambda self: repr(self))
@property
def pathrequest(self):
req_dictionary = {
'request-id':self.request_id,
'source': self.source,
'destination': self.destination,
'src-tp-id': self.srctpid,
'dst-tp-id': self.dsttpid,
'path-constraints':{
'te-bandwidth': {
'technology': 'flexi-grid',
'trx_type' : self.trx_type,
'trx_mode' : self.mode,
'effective-freq-slot':[{'n': 'null','m': 'null'}] ,
'spacing' : self.spacing,
'max-nb-of-channel' : self.nb_channel,
'output-power' : self.power
# 'path_bandwidth' : self.path_bandwidth
}
},
'optimizations': {
'explicit-route-include-objects': [
{
'index': self.nodes_list.index(node),
'unnumbered-hop':{
'node-id': f'{node}',
'link-tp-id': 'link-tp-id is not used',
'hop-type': f'{self.loose}',
'direction': 'direction is not used'
},
'label-hop':{
'te-label': {
'generic': 'generic is not used',
'direction': 'direction is not used'
}
}
}
for node in self.nodes_list
]
}
}
if self.path_bandwidth is not None:
req_dictionary['path-constraints']['te-bandwidth']['path_bandwidth'] = self.path_bandwidth
return req_dictionary
@property
def pathsync(self):
if self.disjoint_from :
return {'synchronization-id':self.request_id,
'svec': {
'relaxable' : 'False',
'link-diverse': 'True',
'node-diverse': 'True',
'request-id-number': [self.request_id]+ [n for n in self.disjoint_from]
}
}
# TO-DO: avoid multiple entries with same synchronisation vectors
@property
def json(self):
return self.pathrequest , self.pathsync
def convert_service_sheet(input_filename, eqpt_filename, output_filename='', filter_region=[]):
service = parse_excel(input_filename)
req = [Request_element(n,eqpt_filename) for n in service]
# dumps the output into a json file whose name is derived from the input filename
# split_filename = [input_filename[0:len(input_filename)-len(suffix_filename)] , suffix_filename[1:]]
if output_filename=='':
output_filename = f'{str(input_filename)[0:len(str(input_filename))-len(str(input_filename.suffixes[0]))]}_services.json'
# for debug
# print(json_filename)
data = {
'path-request': [n.json[0] for n in req],
'synchronization': [n.json[1] for n in req
if n.json[1] is not None]
}
with open(output_filename, 'w', encoding='utf-8') as f:
f.write(dumps(data, indent=2, ensure_ascii=False))
return data
def correct_xlrd_int_to_str_reading(v) :
if not isinstance(v,str):
value = str(int(v))
if value.endswith('.0'):
value = value[:-2]
else:
value = v
return value
# to be used from dutc
def parse_row(row, fieldnames):
return {f: r.value for f, r in zip(fieldnames, row[0:SERVICES_COLUMN])
if r.ctype != XL_CELL_EMPTY}
#
def parse_excel(input_filename):
with open_workbook(input_filename) as wb:
service_sheet = wb.sheet_by_name('Service')
services = list(parse_service_sheet(service_sheet))
return services
def parse_service_sheet(service_sheet):
logger.info(f'Validating headers on {service_sheet.name!r}')
# add a test on field to enable the '' field case that arises when columns on the
# right hand side are used as comments or drawing in the excel sheet
header = [x.value.strip() for x in service_sheet.row(4)[0:SERVICES_COLUMN] if len(x.value.strip())>0]
# create a service_fieldname independent from the excel column order
# to be compatible with any version of the sheet
# the following dictionary records the excel field names and the corresponding parameter's name
authorized_fieldnames = {'route id':'request_id', 'Source':'source', 'Destination':'destination', \
'TRX type':'trx_type', 'Mode' : 'mode', 'System: spacing':'spacing', \
'System: input power (dBm)':'power', 'System: nb of channels':'nb_channel',\
'routing: disjoint from': 'disjoint_from', 'routing: path':'nodes_list',\
'routing: is loose?':'is_loose', 'path bandwidth':'path_bandwidth'}
try :
service_fieldnames = [authorized_fieldnames[e] for e in header]
except KeyError:
msg = f'Malformed header on Service sheet: {header} field not in {authorized_fieldnames}'
logger.critical(msg)
raise ValueError(msg)
for row in all_rows(service_sheet, start=5):
yield Request(**parse_row(row[0:SERVICES_COLUMN], service_fieldnames))
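For reference, the removed parser was typically driven as below (paths illustrative; the output name is derived from the workbook name):

    from pathlib import Path
    data = convert_service_sheet(Path('meshTopologyExampleV2.xls'), 'eqpt_config.json')
    # writes meshTopologyExampleV2_services.json and returns the path-request/synchronization dict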


@@ -1,5 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
UNITS = {'m': 1,
'km': 1E3}


@@ -9,30 +9,19 @@ This module contains utility functions that are used with gnpy.
'''
import json
from csv import writer
import numpy as np
from numpy import pi, cos, sqrt, log10
from scipy import constants
from gnpy.core.exceptions import ConfigurationError
def load_json(filename):
with open(filename, 'r', encoding='utf-8') as f:
data = json.load(f)
return data
def save_json(obj, filename):
with open(filename, 'w', encoding='utf-8') as f:
json.dump(obj, f, indent=2, ensure_ascii=False)
def write_csv(obj, filename):
"""
convert dictionary items to a csv file
the dictionary format :
Convert dictionary items to a CSV file. The dictionary format:
::
{'result category 1':
{'result category 1':
[
# 1st line of results
{'header 1' : value_xxx,
@@ -41,82 +30,83 @@ def write_csv(obj, filename):
{'header 1' : value_www,
'header 2' : value_zzz}
],
'result_category 2':
'result_category 2':
[
{},{}
]
}
}
the generated csv file will be:
result_category 1
header 1 header 2
value_xxx value_yyy
value_www value_zzz
result_category 2
...
The generated csv file will be:
::
result_category 1
header 1 header 2
value_xxx value_yyy
value_www value_zzz
result_category 2
...
"""
with open(filename, 'w', encoding='utf-8') as f:
w = writer(f)
for data_key, data_list in obj.items():
#main header
# main header
w.writerow([data_key])
#sub headers:
# sub headers:
headers = [_ for _ in data_list[0].keys()]
w.writerow(headers)
for data_dict in data_list:
w.writerow([_ for _ in data_dict.values()])
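A minimal call matching the documented layout (the dict content is illustrative):

    results = {'result category 1': [{'header 1': 1.0, 'header 2': 2.0},
                                     {'header 1': 3.0, 'header 2': 4.0}]}
    write_csv(results, 'results.csv')   # one title row, one header row, then the value rows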
def c():
"""
Returns the speed of light in meters per second
"""
return constants.c
def itufs(spacing, startf=191.35, stopf=196.10):
"""Creates an array of frequencies whose default range is
191.35-196.10 THz
:param spacing: Frequency spacing in THz
:param startf: Start frequency in THz
:param stopf: Stop frequency in THz
:type spacing: float
:type startf: float
:type stopf: float
:return: an array of frequencies determined by the spacing parameter
:rtype: numpy.ndarray
"""
return np.arange(startf, stopf + spacing / 2, spacing)
def itufl(length, startf=191.35, stopf=196.10):
"""Creates an array of frequencies whose default range is
191.35-196.10 THz
def arrange_frequencies(length, start, stop):
"""Create an array of frequencies
:param length: number of elements
:param starf: Start frequency in THz
:param stopf: Stop frequency in THz
:param start: Start frequency in THz
:param stop: Stop frequency in THz
:type length: integer
:type startf: float
:type stopf: float
:return an array of frequnecies determined by the spacing parameter
:type start: float
:type stop: float
:return: an array of frequencies determined by the spacing parameter
:rtype: numpy.ndarray
"""
return np.linspace(startf, stopf, length)
def h():
"""
Returns Planck's constant in J*s
"""
return constants.h
return np.linspace(start, stop, length)
def lin2db(value):
"""Convert linear unit to logarithmic (dB)
>>> lin2db(0.001)
-30.0
>>> round(lin2db(1.0), 2)
0.0
>>> round(lin2db(1.26), 2)
1.0
>>> round(lin2db(10.0), 2)
10.0
>>> round(lin2db(100.0), 2)
20.0
"""
return 10 * log10(value)
def db2lin(value):
"""Convert logarithimic units to linear
>>> round(db2lin(10.0), 2)
10.0
>>> round(db2lin(20.0), 2)
100.0
>>> round(db2lin(1.0), 2)
1.26
>>> round(db2lin(0.0), 2)
1.0
>>> round(db2lin(-10.0), 2)
0.1
"""
return 10**(value / 10)
def round2float(number, step):
step = round(step, 1)
if step >= 0.01:
@@ -126,19 +116,28 @@ def round2float(number, step):
number = round(number, 2)
return number
wavelength2freq = constants.lambda2nu
freq2wavelength = constants.nu2lambda
def freq2wavelength(value):
""" Converts frequency units to wavelength units.
>>> round(freq2wavelength(191.35e12) * 1e9, 3)
1566.723
>>> round(freq2wavelength(196.1e12) * 1e9, 3)
1528.773
"""
return c() / value
return constants.c / value
def snr_sum(snr, bw, snr_added, bw_added=12.5e9):
snr_added = snr_added - lin2db(bw/bw_added)
snr = -lin2db(db2lin(-snr)+db2lin(-snr_added))
snr_added = snr_added - lin2db(bw / bw_added)
snr = -lin2db(db2lin(-snr) + db2lin(-snr_added))
return snr
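Equivalently, with $B_{added}$ defaulting to 12.5 GHz:

    SNR_{tot} = -10\log_{10}\Big(10^{-SNR/10} + 10^{-(SNR_{added} - 10\log_{10}(B/B_{added}))/10}\Big)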
def deltawl2deltaf(delta_wl, wavelength):
""" deltawl2deltaf(delta_wl, wavelength):
delta_wl is BW in wavelength units
@@ -200,6 +199,7 @@ def rrc(ffs, baud_rate, alpha):
hf[p_inds] = 1
return sqrt(hf)
def merge_amplifier_restrictions(dict1, dict2):
"""Updates contents of dicts recursively
@@ -222,6 +222,7 @@ def merge_amplifier_restrictions(dict1, dict2):
copy_dict1[key] = dict2[key]
return copy_dict1
def silent_remove(this_list, elem):
"""Remove matching elements from a list without raising ValueError
@@ -239,3 +240,59 @@ def silent_remove(this_list, elem):
except ValueError:
pass
return this_list
def automatic_nch(f_min, f_max, spacing):
"""How many channels are available in the spectrum
:param f_min: Lowest frequency [Hz]
:param f_max: Highest frequency [Hz]
:param spacing: Channel width [Hz]
:return: Number of uniform channels
>>> automatic_nch(191.325e12, 196.125e12, 50e9)
96
>>> automatic_nch(193.475e12, 193.525e12, 50e9)
1
"""
return int((f_max - f_min) // spacing)
def automatic_fmax(f_min, spacing, nch):
"""Find the high-frequenecy boundary of a spectrum
:param f_min Start of the spectrum (lowest frequency edge) [Hz]
:param spacing Grid/channel spacing [Hz]
:param nch Number of channels
:return End of the spectrum (highest frequency) [Hz]
>>> automatic_fmax(191.325e12, 50e9, 96)
196125000000000.0
"""
return f_min + spacing * nch
def convert_length(value, units):
"""Convert length into basic SI units
>>> convert_length(1, 'km')
1000.0
>>> convert_length(2.0, 'km')
2000.0
>>> convert_length(123, 'm')
123.0
>>> convert_length(123.0, 'm')
123.0
>>> convert_length(42.1, 'km')
42100.0
>>> convert_length(666, 'yards')
Traceback (most recent call last):
...
gnpy.core.exceptions.ConfigurationError: Cannot convert length in "yards" into meters
"""
if units == 'm':
return value * 1e0
elif units == 'km':
return value * 1e3
else:
raise ConfigurationError(f'Cannot convert length in "{units}" into meters')


@@ -11,20 +11,22 @@ If not present in the "Nodes" sheet, the "Type" column will be implicitly
determined based on the topology.
"""
try:
from xlrd import open_workbook
except ModuleNotFoundError:
exit('Required: `pip install xlrd`')
from xlrd import open_workbook
from argparse import ArgumentParser
PARSER = ArgumentParser()
PARSER.add_argument('workbook', nargs='?', default='meshTopologyExampleV2.xls',
help='create the mandatory columns in Eqpt sheet')
ALL_ROWS = lambda sh, start=0: (sh.row(x) for x in range(start, sh.nrows))
def ALL_ROWS(sh, start=0):
return (sh.row(x) for x in range(start, sh.nrows))
class Node:
""" Node element contains uid, list of connected nodes and eqpt type
"""
def __init__(self, uid, to_node):
self.uid = uid
self.to_node = to_node
@@ -36,6 +38,7 @@ class Node:
def __str__(self):
return f'uid {self.uid} \nto_node {[node for node in self.to_node]}\neqpt {self.eqpt}\n'
def read_excel(input_filename):
""" read excel Nodes and Links sheets and create a dict of nodes with
their to_nodes and type of eqpt
@@ -73,6 +76,7 @@ def read_excel(input_filename):
exit()
return nodes
def create_eqt_template(nodes, input_filename):
""" writes list of node A node Z corresponding to Nodes and Links sheets in order
to help user populating Eqpt
@@ -85,7 +89,6 @@ def create_eqt_template(nodes, input_filename):
\nNode A \tNode Z \tamp type \tatt_in \tamp gain \ttilt \tatt_out\
amp type \tatt_in \tamp gain \ttilt \tatt_out\n')
for node in nodes.values():
if node.eqpt == 'ILA':
my_file.write(f'{node.uid}\t{node.to_node[0]}\n')
@@ -93,8 +96,8 @@ def create_eqt_template(nodes, input_filename):
for to_node in node.to_node:
my_file.write(f'{node.uid}\t{to_node}\n')
print(f'File {output_filename} successfully created with Node A - Node Z ' +
' entries for Eqpt sheet in excel file.')
print(f'File {output_filename} successfully created with Node A - Node Z entries for Eqpt sheet in excel file.')
if __name__ == '__main__':
ARGS = PARSER.parse_args()
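Equivalent to running the script on a workbook (file name illustrative, taken from the parser default above):

    nodes = read_excel('meshTopologyExampleV2.xls')          # dict of Node objects from the Nodes/Links sheets
    create_eqt_template(nodes, 'meshTopologyExampleV2.xls')  # writes the Eqpt sheet template and prints its file name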


@@ -1,198 +1,8 @@
{
"nf_ripple": [
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0
],
"gain_ripple": [
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0,
0.0
],
"dgt": [
@@ -293,4 +103,4 @@
1.017807767853702,
1.0
]
}
}


@@ -8,7 +8,7 @@ Amplifier models and configuration
Equipment description defines equipment types and parameters.
It takes place in the default **eqpt_config.json** file.
By default **transmission_main_example.py** uses **eqpt_config.json** file and that
By default **gnpy-transmission-example** uses **eqpt_config.json** file and that
can be changed with **-e** or **--equipment** command line parameter.
2. Amplifier parameters and subtypes
@@ -266,7 +266,7 @@ In an opensource and multi-vendor environnement, it is needed to support differe
4. advanced_config_from_json
#######################################
The build_oa_json.py library in gnpy/examples/edfa_model can be used to build the json file required for the amplifier advanced_model type_def:
The build_oa_json.py library in ``gnpy/example-data/edfa_model/`` can be used to build the json file required for the amplifier advanced_model type_def:
It updates an existing json file with all the 96-channel txt files for a given amplifier type.
The amplifier type 'OA_type1' is hard-coded but can be modified, and other types can be added.
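A sketch of the expected invocation, per the script's __main__ block (the path argument is optional and defaults to OA.json):

    # equivalent to: python build_oa_json.py OA.json
    input_json('OA.json')   # reads OA.json plus the referenced .txt files, writes default_edfa_config.json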


@@ -13,7 +13,6 @@ import re
import sys
import json
import numpy as np
from gnpy.core.utils import lin2db, db2lin
"""amplifier file names
convert a set of amplifier files + input json definition file into a valid edfa_json_file:
input json file in argument (default = 'OA.json')
the json input file should have the following fields:
{
"nf_fit_coeff": "nf_filename.txt",
"nf_ripple": "nf_ripple_filename.txt",
"nf_ripple": "nf_ripple_filename.txt",
"gain_ripple": "DFG_filename.txt",
"dgt": "DGT_filename.txt",
}
"""
input_json_file_name = "OA.json" #default path
input_json_file_name = "OA.json" # default path
output_json_file_name = "default_edfa_config.json"
gain_ripple_field = "gain_ripple"
nf_ripple_field = "nf_ripple"
nf_fit_coeff = "nf_fit_coeff"
def read_file(field, file_name):
"""read and format the 96 channels txt files describing the amplifier NF and ripple
convert dfg into gain ripple by removing the mean component
"""
#with open(path + file_name,'r') as this_file:
# with open(path + file_name,'r') as this_file:
# data = this_file.read()
#data.strip()
# data.strip()
#data = re.sub(r"([0-9])([ ]{1,3})([0-9-+])",r"\1,\3",data)
#data = list(data.split(","))
#data = [float(x) for x in data]
data = np.loadtxt(file_name)
print(len(data), file_name)
if field == gain_ripple_field or field == nf_ripple_field:
#consider ripple excursion only to avoid redundant information
#because the max flat_gain is already given by the 'gain_flat' field in json
#remove the mean component
# consider ripple excursion only to avoid redundant information
# because the max flat_gain is already given by the 'gain_flat' field in json
# remove the mean component
print(file_name, ', mean value =', data.mean(), ' is subtracted')
data = data - data.mean()
data = data.tolist()
return data
def input_json(path):
"""read the json input file and add all the 96 channels txt files
create the output json file with output_json_file_name"""
with open(path,'r') as edfa_json_file:
with open(path, 'r') as edfa_json_file:
amp_text = edfa_json_file.read()
amp_dict = json.loads(amp_text)
for k, v in amp_dict.items():
if re.search(r'.txt$',str(v)) :
if re.search(r'.txt$', str(v)):
amp_dict[k] = read_file(k, v)
amp_text = json.dumps(amp_dict, indent=4)
#print(amp_text)
with open(output_json_file_name,'w') as edfa_json_file:
# print(amp_text)
with open(output_json_file_name, 'w') as edfa_json_file:
edfa_json_file.write(amp_text)
if __name__ == '__main__':
if len(sys.argv) == 2:
path = sys.argv[1]
else:
path = input_json_file_name
input_json(path)
input_json(path)


@@ -146,23 +146,27 @@
"Fiber":[{
"type_variety": "SSMF",
"dispersion": 1.67e-05,
"gamma": 0.00127
"gamma": 0.00127,
"pmd_coef": 1.265e-15
},
{
"type_variety": "NZDF",
"dispersion": 0.5e-05,
"gamma": 0.00146
"gamma": 0.00146,
"pmd_coef": 1.265e-15
},
{
"type_variety": "LOF",
"dispersion": 2.2e-05,
"gamma": 0.000843
"gamma": 0.000843,
"pmd_coef": 1.265e-15
}
],
"RamanFiber":[{
"type_variety": "SSMF",
"dispersion": 1.67e-05,
"gamma": 0.00127,
"pmd_coef": 1.265e-15,
"raman_efficiency": {
"cr":[
0, 9.4E-06, 2.92E-05, 4.88E-05, 6.82E-05, 8.31E-05, 9.4E-05, 0.0001014, 0.0001069, 0.0001119,
@@ -206,6 +210,7 @@
"Roadm":[{
"target_pch_out_db": -20,
"add_drop_osnr": 38,
"pmd": 0,
"restrictions": {
"preamp_variety_list":[],
"booster_variety_list":[]


@@ -643,44 +643,6 @@
"out_voa": null
}
},
{
"uid": "east edfa in Corlay to Loudeac",
"metadata": {
"location": {
"city": "Corlay",
"region": "RLD",
"latitude": 2.0,
"longitude": 1.0
}
},
"type": "Edfa",
"type_variety": "std_low_gain",
"operational": {
"gain_target": null,
"delta_p": 1.0,
"tilt_target": 0,
"out_voa": null
}
},
{
"uid": "east edfa in Loudeac to Lorient_KMA",
"metadata": {
"location": {
"city": "Loudeac",
"region": "RLD",
"latitude": 2.0,
"longitude": 2.0
}
},
"type": "Edfa",
"type_variety": "std_low_gain",
"operational": {
"gain_target": null,
"delta_p": 1.0,
"tilt_target": 0,
"out_voa": null
}
},
{
"uid": "east edfa in Lannion_CAS to Stbrieuc",
"metadata": {
@@ -833,44 +795,6 @@
"out_voa": null
}
},
{
"uid": "west edfa in Corlay to Loudeac",
"metadata": {
"location": {
"city": "Corlay",
"region": "RLD",
"latitude": 2.0,
"longitude": 1.0
}
},
"type": "Edfa",
"type_variety": "std_low_gain",
"operational": {
"gain_target": null,
"delta_p": 1.0,
"tilt_target": 0,
"out_voa": null
}
},
{
"uid": "west edfa in Loudeac to Lorient_KMA",
"metadata": {
"location": {
"city": "Loudeac",
"region": "RLD",
"latitude": 2.0,
"longitude": 2.0
}
},
"type": "Edfa",
"type_variety": "std_low_gain",
"operational": {
"gain_target": null,
"delta_p": 1.0,
"tilt_target": 0,
"out_voa": null
}
},
{
"uid": "west edfa in Lorient_KMA to Vannes_KBE",
"metadata": {


@@ -2,10 +2,11 @@
"path-request": [
{
"request-id": "0",
"source": "Lorient_KMA",
"destination": "Vannes_KBE",
"source": "trx Lorient_KMA",
"destination": "trx Vannes_KBE",
"src-tp-id": "trx Lorient_KMA",
"dst-tp-id": "trx Vannes_KBE",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -13,8 +14,8 @@
"trx_mode": null,
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 50000000000.0,
@@ -22,17 +23,15 @@
"output-power": 0.0012589254117941673,
"path_bandwidth": 100000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
},
{
"request-id": "1",
"source": "Brest_KLA",
"destination": "Vannes_KBE",
"source": "trx Brest_KLA",
"destination": "trx Vannes_KBE",
"src-tp-id": "trx Brest_KLA",
"dst-tp-id": "trx Vannes_KBE",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -40,8 +39,8 @@
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 50000000000.0,
@@ -50,66 +49,42 @@
"path_bandwidth": 200000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": [
"explicit-route-objects": {
"route-object-include-exclude": [
{
"explicit-route-usage": "route-include-ero",
"index": 0,
"unnumbered-hop": {
"num-unnum-hop": {
"node-id": "roadm Brest_KLA",
"link-tp-id": "link-tp-id is not used",
"hop-type": "loose",
"direction": "direction is not used"
},
"label-hop": {
"te-label": {
"generic": "generic is not used",
"direction": "direction is not used"
}
"hop-type": "LOOSE"
}
},
{
"explicit-route-usage": "route-include-ero",
"index": 1,
"unnumbered-hop": {
"num-unnum-hop": {
"node-id": "roadm Lannion_CAS",
"link-tp-id": "link-tp-id is not used",
"hop-type": "loose",
"direction": "direction is not used"
},
"label-hop": {
"te-label": {
"generic": "generic is not used",
"direction": "direction is not used"
}
"hop-type": "LOOSE"
}
},
{
"explicit-route-usage": "route-include-ero",
"index": 2,
"unnumbered-hop": {
"num-unnum-hop": {
"node-id": "roadm Lorient_KMA",
"link-tp-id": "link-tp-id is not used",
"hop-type": "loose",
"direction": "direction is not used"
},
"label-hop": {
"te-label": {
"generic": "generic is not used",
"direction": "direction is not used"
}
"hop-type": "LOOSE"
}
},
{
"explicit-route-usage": "route-include-ero",
"index": 3,
"unnumbered-hop": {
"num-unnum-hop": {
"node-id": "roadm Vannes_KBE",
"link-tp-id": "link-tp-id is not used",
"hop-type": "loose",
"direction": "direction is not used"
},
"label-hop": {
"te-label": {
"generic": "generic is not used",
"direction": "direction is not used"
}
"hop-type": "LOOSE"
}
}
]
@@ -117,10 +92,11 @@
},
{
"request-id": "3",
"source": "Lannion_CAS",
"destination": "Rennes_STA",
"source": "trx Lannion_CAS",
"destination": "trx Rennes_STA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Rennes_STA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -128,8 +104,8 @@
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 50000000000.0,
@@ -137,17 +113,15 @@
"output-power": null,
"path_bandwidth": 60000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
},
{
"request-id": "4",
"source": "Rennes_STA",
"destination": "Lannion_CAS",
"source": "trx Rennes_STA",
"destination": "trx Lannion_CAS",
"src-tp-id": "trx Rennes_STA",
"dst-tp-id": "trx Lannion_CAS",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -155,8 +129,8 @@
"trx_mode": null,
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 75000000000.0,
@@ -164,17 +138,15 @@
"output-power": 0.0019952623149688794,
"path_bandwidth": 150000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
},
{
"request-id": "5",
"source": "Rennes_STA",
"destination": "Lannion_CAS",
"source": "trx Rennes_STA",
"destination": "trx Lannion_CAS",
"src-tp-id": "trx Rennes_STA",
"dst-tp-id": "trx Lannion_CAS",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -182,8 +154,8 @@
"trx_mode": "mode 2",
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 75000000000.0,
@@ -191,17 +163,15 @@
"output-power": 0.0019952623149688794,
"path_bandwidth": 20000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
},
{
"request-id": "6",
"source": "Lannion_CAS",
"destination": "Lorient_KMA",
"source": "trx Lannion_CAS",
"destination": "trx Lorient_KMA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Lorient_KMA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -209,8 +179,8 @@
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 50000000000.0,
@@ -218,17 +188,15 @@
"output-power": 0.001,
"path_bandwidth": 300000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
},
{
"request-id": "7",
"source": "Lannion_CAS",
"destination": "Lorient_KMA",
"source": "trx Lannion_CAS",
"destination": "trx Lorient_KMA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Lorient_KMA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -236,8 +204,8 @@
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 50000000000.0,
@@ -245,17 +213,15 @@
"output-power": 0.001,
"path_bandwidth": 400000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
},
{
"request-id": "7b",
"source": "Lannion_CAS",
"destination": "Lorient_KMA",
"source": "trx Lannion_CAS",
"destination": "trx Lorient_KMA",
"src-tp-id": "trx Lannion_CAS",
"dst-tp-id": "trx Lorient_KMA",
"bidirectional": false,
"path-constraints": {
"te-bandwidth": {
"technology": "flexi-grid",
@@ -263,8 +229,8 @@
"trx_mode": "mode 1",
"effective-freq-slot": [
{
"n": "null",
"m": "null"
"N": null,
"M": null
}
],
"spacing": 75000000000.0,
@@ -272,9 +238,6 @@
"output-power": 0.001,
"path_bandwidth": 400000000000.0
}
},
"optimizations": {
"explicit-route-include-objects": []
}
}
],
@@ -282,9 +245,8 @@
{
"synchronization-id": "3",
"svec": {
"relaxable": "False",
"link-diverse": "True",
"node-diverse": "True",
"relaxable": "false",
"disjointness": "node link",
"request-id-number": [
"3",
"1"
@@ -294,9 +256,8 @@
{
"synchronization-id": "4",
"svec": {
"relaxable": "False",
"link-diverse": "True",
"node-diverse": "True",
"relaxable": "false",
"disjointness": "node link",
"request-id-number": [
"4",
"5"


@@ -1,5 +1,4 @@
{
"raman_computed_channels": [1, 18, 37, 56, 75],
"raman_parameters": {
"flag_raman": true,
"space_resolution": 10e3,
@@ -9,6 +8,7 @@
"nli_method_name": "ggn_spectrally_separated",
"wdm_grid_size": 50e9,
"dispersion_tolerance": 1,
"phase_shift_tollerance": 0.1
"phase_shift_tolerance": 0.1,
"computed_channels": [1, 18, 37, 56, 75]
}
}
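Read together, the two hunks above move the channel list from the top-level "raman_computed_channels" key into "nli_parameters" as "computed_channels", and fix the misspelled "phase_shift_tollerance" key. A minimal sketch of reading the new layout, assuming a local copy of the file above saved as sim_params.json:

import json

with open('sim_params.json') as f:  # hypothetical path to the file shown above
    sim = json.load(f)
# the channel indices now live under nli_parameters
channels = sim['nli_parameters']['computed_channels']        # [1, 18, 37, 56, 75]
tolerance = sim['nli_parameters']['phase_shift_tolerance']   # 0.1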


@@ -14,14 +14,14 @@ See: draft-ietf-teas-yang-path-computation-01.txt
from argparse import ArgumentParser
from pathlib import Path
from json import loads
- from gnpy.core.equipment import load_equipment
- from gnpy.core.request import jsontocsv
+ from gnpy.tools.json_io import load_equipment
+ from gnpy.topology.request import jsontocsv
- parser = ArgumentParser(description = 'A function that writes json path results in an excel sheet.')
- parser.add_argument('filename', nargs='?', type = Path)
- parser.add_argument('output_filename', nargs='?', type = Path)
- parser.add_argument('eqpt_filename', nargs='?', type = Path, default=Path(__file__).parent / 'eqpt_config.json')
+ parser = ArgumentParser(description='A function that writes json path results in an excel sheet.')
+ parser.add_argument('filename', nargs='?', type=Path)
+ parser.add_argument('output_filename', nargs='?', type=Path)
+ parser.add_argument('eqpt_filename', nargs='?', type=Path, default=Path(__file__).parent / 'eqpt_config.json')
if __name__ == '__main__':
args = parser.parse_args()
@@ -32,5 +32,4 @@ if __name__ == '__main__':
json_data = loads(f.read())
equipment = load_equipment(args.eqpt_filename)
print(f'Writing in {args.output_filename}')
- jsontocsv(json_data,equipment,file)
+ jsontocsv(json_data, equipment, file)

gnpy/tools/__init__.py (new file)

@@ -0,0 +1,5 @@
'''
Processing of data via :py:mod:`.json_io`.
Utilities for Excel conversion in :py:mod:`.convert` and :py:mod:`.service_sheet`.
Example code in :py:mod:`.cli_examples` and :py:mod:`.plots`.
'''

gnpy/tools/cli_examples.py (new file)

@@ -0,0 +1,434 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.tools.cli_examples
=======================
Common code for CLI examples
'''
import argparse
from json import dumps
import logging
import os.path
import sys
from math import ceil
from numpy import linspace, mean
from pathlib import Path
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.core.elements import Transceiver, Fiber, RamanFiber
from gnpy.core.equipment import trx_mode_params
import gnpy.core.exceptions as exceptions
from gnpy.core.network import build_network
from gnpy.core.parameters import SimParams
from gnpy.core.science_utils import Simulation
from gnpy.core.utils import db2lin, lin2db, automatic_nch
from gnpy.topology.request import (ResultElement, jsontocsv, compute_path_dsjctn, requests_aggregation,
BLOCKING_NOPATH, correct_json_route_list,
deduplicate_disjunctions, compute_path_with_disjunction,
PathRequest, compute_constrained_path, propagate2)
from gnpy.topology.spectrum_assignment import build_oms_list, pth_assign_spectrum
from gnpy.tools.json_io import load_equipment, load_network, load_json, load_requests, save_network, \
requests_from_json, disjunctions_from_json, save_json
from gnpy.tools.plots import plot_baseline, plot_results
_logger = logging.getLogger(__name__)
_examples_dir = Path(__file__).parent.parent / 'example-data'
_help_footer = '''
This program is part of GNPy, https://github.com/TelecomInfraProject/oopt-gnpy
Learn more at https://gnpy.readthedocs.io/
'''
_help_fname_json = 'FILE.json'
_help_fname_json_csv = 'FILE.(json|csv)'
def show_example_data_dir():
print(f'{_examples_dir}/')
def load_common_data(equipment_filename, topology_filename, simulation_filename, save_raw_network_filename):
'''Load common configuration from JSON files'''
try:
equipment = load_equipment(equipment_filename)
network = load_network(topology_filename, equipment)
if save_raw_network_filename is not None:
save_network(network, save_raw_network_filename)
print(f'{ansi_escapes.blue}Raw network (no optimizations) saved to {save_raw_network_filename}{ansi_escapes.reset}')
sim_params = SimParams(**load_json(simulation_filename)) if simulation_filename is not None else None
if not sim_params:
if next((node for node in network if isinstance(node, RamanFiber)), None) is not None:
print(f'{ansi_escapes.red}Invocation error:{ansi_escapes.reset} '
f'RamanFiber requires passing simulation params via --sim-params')
sys.exit(1)
else:
Simulation.set_params(sim_params)
except exceptions.EquipmentConfigError as e:
print(f'{ansi_escapes.red}Configuration error in the equipment library:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.NetworkTopologyError as e:
print(f'{ansi_escapes.red}Invalid network definition:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.ConfigurationError as e:
print(f'{ansi_escapes.red}Configuration error:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.ParametersError as e:
print(f'{ansi_escapes.red}Simulation parameters error:{ansi_escapes.reset} {e}')
sys.exit(1)
except exceptions.ServiceError as e:
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
sys.exit(1)
return (equipment, network)
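# A hypothetical call, with the file names being assumptions:
#     equipment, network = load_common_data(Path('eqpt_config.json'), Path('topology.json'),
#                                           simulation_filename=None, save_raw_network_filename=None)
# any configuration problem exits with a readable message instead of a traceback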
def _setup_logging(args):
logging.basicConfig(level={2: logging.DEBUG, 1: logging.INFO, 0: logging.CRITICAL}.get(args.verbose, logging.DEBUG))
def _add_common_options(parser: argparse.ArgumentParser, network_default: Path):
parser.add_argument('topology', nargs='?', type=Path, metavar='NETWORK-TOPOLOGY.(json|xls|xlsx)',
default=network_default,
help='Input network topology')
parser.add_argument('-v', '--verbose', action='count', default=0,
help='Increase verbosity (can be specified several times)')
parser.add_argument('-e', '--equipment', type=Path, metavar=_help_fname_json,
default=_examples_dir / 'eqpt_config.json', help='Equipment library')
parser.add_argument('--sim-params', type=Path, metavar=_help_fname_json,
default=None, help='Path to the JSON containing simulation parameters (required for Raman). '
f'Example: {_examples_dir / "sim_params.json"}')
parser.add_argument('--save-network', type=Path, metavar=_help_fname_json,
help='Save the final network as a JSON file')
parser.add_argument('--save-network-before-autodesign', type=Path, metavar=_help_fname_json,
help='Dump the network into a JSON file prior to autodesign')
def transmission_main_example(args=None):
parser = argparse.ArgumentParser(
description='Send a full spectrum load through the network from point A to point B',
epilog=_help_footer,
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
)
_add_common_options(parser, network_default=_examples_dir / 'edfa_example_network.json')
parser.add_argument('--show-channels', action='store_true', help='Show final per-channel OSNR summary')
parser.add_argument('-pl', '--plot', action='store_true')
parser.add_argument('-l', '--list-nodes', action='store_true', help='list all transceiver nodes')
parser.add_argument('-po', '--power', default=0, help='channel ref power in dBm')
parser.add_argument('source', nargs='?', help='source node')
parser.add_argument('destination', nargs='?', help='destination node')
args = parser.parse_args(args if args is not None else sys.argv[1:])
_setup_logging(args)
(equipment, network) = load_common_data(args.equipment, args.topology, args.sim_params, args.save_network_before_autodesign)
if args.plot:
plot_baseline(network)
transceivers = {n.uid: n for n in network.nodes() if isinstance(n, Transceiver)}
if not transceivers:
sys.exit('Network has no transceivers!')
if len(transceivers) < 2:
sys.exit('Network has only one transceiver!')
if args.list_nodes:
for uid in transceivers:
print(uid)
sys.exit()
# First try to find exact match if source/destination provided
if args.source:
source = transceivers.pop(args.source, None)
valid_source = True if source else False
else:
source = None
_logger.info('No source node specified: picking random transceiver')
if args.destination:
destination = transceivers.pop(args.destination, None)
valid_destination = True if destination else False
else:
destination = None
_logger.info('No destination node specified: picking random transceiver')
# If no exact match try to find partial match
if args.source and not source:
# TODO code a more advanced regex to find nodes match
source = next((transceivers.pop(uid) for uid in transceivers
if args.source.lower() in uid.lower()), None)
if args.destination and not destination:
# TODO code a more advanced regex to find nodes match
destination = next((transceivers.pop(uid) for uid in transceivers
if args.destination.lower() in uid.lower()), None)
# If no partial match or no source/destination provided pick random
if not source:
source = list(transceivers.values())[0]
del transceivers[source.uid]
if not destination:
destination = list(transceivers.values())[0]
_logger.info(f'source = {args.source!r}')
_logger.info(f'destination = {args.destination!r}')
params = {}
params['request_id'] = 0
params['trx_type'] = ''
params['trx_mode'] = ''
params['source'] = source.uid
params['destination'] = destination.uid
params['bidir'] = False
params['nodes_list'] = [destination.uid]
params['loose_list'] = ['strict']
params['format'] = ''
params['path_bandwidth'] = 0
params['effective_freq_slot'] = None
trx_params = trx_mode_params(equipment)
if args.power:
trx_params['power'] = db2lin(float(args.power)) * 1e-3
params.update(trx_params)
req = PathRequest(**params)
power_mode = equipment['Span']['default'].power_mode
print('\n'.join([f'Power mode is set to {power_mode}',
f'=> it can be modified in eqpt_config.json - Span']))
pref_ch_db = lin2db(req.power * 1e3) # reference channel power / span (SL=20dB)
pref_total_db = pref_ch_db + lin2db(req.nb_channel) # reference total power / span (SL=20dB)
build_network(network, equipment, pref_ch_db, pref_total_db)
path = compute_constrained_path(network, req)
spans = [s.params.length for s in path if isinstance(s, RamanFiber) or isinstance(s, Fiber)]
print(f'\nThere are {len(spans)} fiber spans over {sum(spans)/1000:.0f} km between {source.uid} '
f'and {destination.uid}')
print(f'\nNow propagating between {source.uid} and {destination.uid}:')
try:
p_start, p_stop, p_step = equipment['SI']['default'].power_range_db
p_num = abs(int(round((p_stop - p_start) / p_step))) + 1 if p_step != 0 else 1
power_range = list(linspace(p_start, p_stop, p_num))
except TypeError:
print('invalid power range definition in eqpt_config, should be power_range_db: [lower, upper, step]')
power_range = [0]
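# for instance, power_range_db = [-2, 3, 1] in the SI block yields the sweep
# [-2, -1, 0, 1, 2, 3] dB around the reference channel power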
if not power_mode:
# power cannot be changed in gain mode
power_range = [0]
for dp_db in power_range:
req.power = db2lin(pref_ch_db + dp_db) * 1e-3
if power_mode:
print(f'\nPropagating with input power = {ansi_escapes.cyan}{lin2db(req.power*1e3):.2f} dBm{ansi_escapes.reset}:')
else:
print(f'\nPropagating in {ansi_escapes.cyan}gain mode{ansi_escapes.reset}: power cannot be set manually')
infos = propagate2(path, req, equipment)
if len(power_range) == 1:
for elem in path:
print(elem)
if power_mode:
print(f'\nTransmission result for input power = {lin2db(req.power*1e3):.2f} dBm:')
else:
print(f'\nTransmission results:')
print(f' Final SNR total (0.1 nm): {ansi_escapes.cyan}{mean(destination.snr_01nm):.02f} dB{ansi_escapes.reset}')
else:
print(path[-1])
# print(f'\n !!!!!!!!!!!!!!!!! TEST POINT !!!!!!!!!!!!!!!!!!!!!')
# print(f'carriers ase output of {path[1]} =\n {list(path[1].carriers("out", "nli"))}')
# => use "in" or "out" parameter
# => use "nli" or "ase" or "signal" or "total" parameter
if args.save_network is not None:
save_network(network, args.save_network)
print(f'{ansi_escapes.blue}Network (after autodesign) saved to {args.save_network}{ansi_escapes.reset}')
if args.show_channels:
print('\nThe total SNR per channel at the end of the line is:')
print(
'{:>5}{:>26}{:>26}{:>28}{:>28}{:>28}' .format(
'Ch. #',
'Channel frequency (THz)',
'Channel power (dBm)',
'OSNR ASE (signal bw, dB)',
'SNR NLI (signal bw, dB)',
'SNR total (signal bw, dB)'))
for final_carrier, ch_osnr, ch_snr_nl, ch_snr in zip(
infos[path[-1]][1].carriers, path[-1].osnr_ase, path[-1].osnr_nli, path[-1].snr):
ch_freq = final_carrier.frequency * 1e-12
ch_power = lin2db(final_carrier.power.signal * 1e3)
print(
'{:5}{:26.2f}{:26.2f}{:28.2f}{:28.2f}{:28.2f}' .format(
final_carrier.channel_number, round(
ch_freq, 2), round(
ch_power, 2), round(
ch_osnr, 2), round(
ch_snr_nl, 2), round(
ch_snr, 2)))
if not args.source:
print(f'\n(No source node specified: picked {source.uid})')
elif not valid_source:
print(f'\n(Invalid source node {args.source!r} replaced with {source.uid})')
if not args.destination:
print(f'\n(No destination node specified: picked {destination.uid})')
elif not valid_destination:
print(f'\n(Invalid destination node {args.destination!r} replaced with {destination.uid})')
if args.plot:
plot_results(network, path, source, destination, infos)
def _path_result_json(pathresult):
return {'response': [n.json for n in pathresult]}
def path_requests_run(args=None):
parser = argparse.ArgumentParser(
description='Compute performance for a list of services provided in a json file or an excel sheet',
epilog=_help_footer,
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
)
_add_common_options(parser, network_default=_examples_dir / 'meshTopologyExampleV2.xls')
parser.add_argument('service_filename', nargs='?', type=Path, metavar='SERVICES-REQUESTS.(json|xls|xlsx)',
default=_examples_dir / 'meshTopologyExampleV2.xls',
help='Input service file')
parser.add_argument('-bi', '--bidir', action='store_true',
help='considers that all demands are bidir')
parser.add_argument('-o', '--output', type=Path, metavar=_help_fname_json_csv,
help='Store satisfied requests into a JSON or CSV file')
args = parser.parse_args(args if args is not None else sys.argv[1:])
_setup_logging(args)
_logger.info(f'Computing path requests {args.service_filename} into JSON format')
print(f'{ansi_escapes.blue}Computing path requests {os.path.relpath(args.service_filename)} into JSON format{ansi_escapes.reset}')
(equipment, network) = load_common_data(args.equipment, args.topology, args.sim_params, args.save_network_before_autodesign)
# Build the network once using the default power defined in SI in eqpt config
# TODO power density: db2lin(power_dbm) / power_dbm * nb channels as defined by
# spacing, f_min and f_max
p_db = equipment['SI']['default'].power_dbm
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
if args.save_network is not None:
save_network(network, args.save_network)
print(f'{ansi_escapes.blue}Network (after autodesign) saved to {args.save_network}{ansi_escapes.reset}')
oms_list = build_oms_list(network, equipment)
try:
data = load_requests(args.service_filename, equipment, bidir=args.bidir,
network=network, network_filename=args.topology)
rqs = requests_from_json(data, equipment)
except exceptions.ServiceError as e:
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
sys.exit(1)
# check that request ids are unique; non-unique ids may mess up
# the computation, so better to stop here
all_ids = [r.request_id for r in rqs]
if len(all_ids) != len(set(all_ids)):
for item in list(set(all_ids)):
all_ids.remove(item)
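# one occurrence of each id has been removed, so what remains in all_ids
# is exactly the list of duplicated ids, e.g. ['1', '2', '2'] -> ['2']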
msg = f'Requests id {all_ids} are not unique'
_logger.critical(msg)
sys.exit()
rqs = correct_json_route_list(network, rqs)
# pths = compute_path(network, equipment, rqs)
dsjn = disjunctions_from_json(data)
print(f'{ansi_escapes.blue}List of disjunctions{ansi_escapes.reset}')
print(dsjn)
# need to warn or correct in case of wrong disjunction form
# disjunction must not be repeated with same or different ids
dsjn = deduplicate_disjunctions(dsjn)
# Aggregate demands with same exact constraints
print(f'{ansi_escapes.blue}Aggregating similar requests{ansi_escapes.reset}')
rqs, dsjn = requests_aggregation(rqs, dsjn)
# TODO export novel set of aggregated demands in a json file
print(f'{ansi_escapes.blue}The following services have been requested:{ansi_escapes.reset}')
print(rqs)
print(f'{ansi_escapes.blue}Computing all paths with constraints{ansi_escapes.reset}')
try:
pths = compute_path_dsjctn(network, equipment, rqs, dsjn)
except exceptions.DisjunctionError as this_e:
print(f'{ansi_escapes.red}Disjunction error:{ansi_escapes.reset} {this_e}')
sys.exit(1)
print(f'{ansi_escapes.blue}Propagating on selected path{ansi_escapes.reset}')
propagatedpths, reversed_pths, reversed_propagatedpths = compute_path_with_disjunction(network, equipment, rqs, pths)
# Note that the deepcopy used in compute_path_with_disjunction returns
# a list of nodes which do not belong to the network (they are copies of the node objects),
# so there can be no propagation on these nodes.
pth_assign_spectrum(pths, rqs, oms_list, reversed_pths)
print(f'{ansi_escapes.blue}Result summary{ansi_escapes.reset}')
header = ['req id', ' demand', ' snr@bandwidth A-Z (Z-A)', ' snr@0.1nm A-Z (Z-A)',
' Receiver minOSNR', ' mode', ' Gbit/s', ' nb of tsp pairs',
'N,M or blocking reason']
data = []
data.append(header)
for i, this_p in enumerate(propagatedpths):
rev_pth = reversed_propagatedpths[i]
if rev_pth and this_p:
psnrb = f'{round(mean(this_p[-1].snr),2)} ({round(mean(rev_pth[-1].snr),2)})'
psnr = f'{round(mean(this_p[-1].snr_01nm), 2)}' +\
f' ({round(mean(rev_pth[-1].snr_01nm),2)})'
elif this_p:
psnrb = f'{round(mean(this_p[-1].snr),2)}'
psnr = f'{round(mean(this_p[-1].snr_01nm),2)}'
try:
if rqs[i].blocking_reason in BLOCKING_NOPATH:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} :',
f'-', f'-', f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',
f'-', f'{rqs[i].blocking_reason}']
else:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,
psnr, f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9, 2)}',
f'-', f'{rqs[i].blocking_reason}']
except AttributeError:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,
psnr, f'{rqs[i].OSNR}', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',
f'{ceil(rqs[i].path_bandwidth / rqs[i].bit_rate) }', f'({rqs[i].N},{rqs[i].M})']
data.append(line)
col_width = max(len(word) for row in data for word in row[2:]) # padding
firstcol_width = max(len(row[0]) for row in data) # padding
secondcol_width = max(len(row[1]) for row in data) # padding
for row in data:
firstcol = ''.join(row[0].ljust(firstcol_width))
secondcol = ''.join(row[1].ljust(secondcol_width))
remainingcols = ''.join(word.center(col_width, ' ') for word in row[2:])
print(f'{firstcol} {secondcol} {remainingcols}')
print(f'{ansi_escapes.yellow}Result summary shows mean SNR and OSNR (average over all channels){ansi_escapes.reset}')
if args.output:
result = []
# assumes that the list of rqs and the list of propagatedpths have the same order
for i, pth in enumerate(propagatedpths):
result.append(ResultElement(rqs[i], pth, reversed_propagatedpths[i]))
temp = _path_result_json(result)
if args.output.suffix.lower() == '.json':
save_json(temp, args.output)
print(f'{ansi_escapes.blue}Saved JSON to {args.output}{ansi_escapes.reset}')
elif args.output.suffix.lower() == '.csv':
with open(args.output, "w", encoding='utf-8') as fcsv:
jsontocsv(temp, equipment, fcsv)
print(f'{ansi_escapes.blue}Saved CSV to {args.output}{ansi_escapes.reset}')
else:
print(f'{ansi_escapes.red}Cannot save output: neither JSON nor CSV file{ansi_escapes.reset}')
sys.exit(1)

gnpy/tools/convert.py (new executable file)

@@ -0,0 +1,745 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.tools.convert
==================
This module contains utilities for converting between XLS and JSON.
The input XLS file must contain sheets named "Nodes" and "Links".
It may optionally contain a sheet named "Eqpt".
In the "Nodes" sheet, only the "City" column is mandatory. The column "Type"
can be determined automatically given the topology (e.g., if degree 2, ILA;
otherwise, ROADM.) Incorrectly specified types (e.g., ILA for node of
degree ≠ 2) will be automatically corrected.
In the "Links" sheet, only the first three columns ("Node A", "Node Z" and
"east Distance (km)") are mandatory. Missing "west" information is copied from
the "east" information so that it is possible to input undirected data.
"""
from sys import exit
from xlrd import open_workbook
from argparse import ArgumentParser
from collections import namedtuple, Counter, defaultdict
from itertools import chain
from json import dumps
from pathlib import Path
from copy import copy
from gnpy.core import ansi_escapes
from gnpy.core.utils import silent_remove
from gnpy.core.exceptions import NetworkTopologyError
from gnpy.core.elements import Edfa, Fused, Fiber
def all_rows(sh, start=0):
return (sh.row(x) for x in range(start, sh.nrows))
class Node(object):
def __init__(self, **kwargs):
super(Node, self).__init__()
self.update_attr(kwargs)
def update_attr(self, kwargs):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in self.default_values.items():
v = clean_kwargs.get(k, v)
setattr(self, k, v)
default_values = {
'city': '',
'state': '',
'country': '',
'region': '',
'latitude': 0,
'longitude': 0,
'node_type': 'ILA',
'booster_restriction': '',
'preamp_restriction': ''
}
class Link(object):
"""attribtes from west parse_ept_headers dict
+node_a, node_z, west_fiber_con_in, east_fiber_con_in
"""
def __init__(self, **kwargs):
super(Link, self).__init__()
self.update_attr(kwargs)
self.distance_units = 'km'
def update_attr(self, kwargs):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in self.default_values.items():
v = clean_kwargs.get(k, v)
setattr(self, k, v)
k = 'west' + k.split('east')[-1]
v = clean_kwargs.get(k, v)
setattr(self, k, v)
def __eq__(self, link):
return (self.from_city == link.from_city and self.to_city == link.to_city) \
or (self.from_city == link.to_city and self.to_city == link.from_city)
default_values = {
'from_city': '',
'to_city': '',
'east_distance': 80,
'east_fiber': 'SSMF',
'east_lineic': 0.2,
'east_con_in': None,
'east_con_out': None,
'east_pmd': 0.1,
'east_cable': ''
}
class Eqpt(object):
def __init__(self, **kwargs):
super(Eqpt, self).__init__()
self.update_attr(kwargs)
def update_attr(self, kwargs):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in self.default_values.items():
v_east = clean_kwargs.get(k, v)
setattr(self, k, v_east)
k = 'west' + k.split('east')[-1]
v_west = clean_kwargs.get(k, v)
setattr(self, k, v_west)
default_values = {
'from_city': '',
'to_city': '',
'east_amp_type': '',
'east_att_in': 0,
'east_amp_gain': None,
'east_amp_dp': None,
'east_tilt': 0,
'east_att_out': None
}
def read_header(my_sheet, line, slice_):
""" return the list of headers !:= ''
header_i = [(header, header_column_index), ...]
in a {line, slice1_x, slice_y} range
"""
Param_header = namedtuple('Param_header', 'header colindex')
try:
header = [x.value.strip() for x in my_sheet.row_slice(line, slice_[0], slice_[1])]
header_i = [Param_header(header, i + slice_[0]) for i, header in enumerate(header) if header != '']
except Exception:
header_i = []
if header_i != [] and header_i[-1].colindex != slice_[1]:
header_i.append(Param_header('', slice_[1]))
return header_i
def read_slice(my_sheet, line, slice_, header):
"""return the slice range of a given header
in a defined range {line, slice_x, slice_y}"""
header_i = read_header(my_sheet, line, slice_)
slice_range = (-1, -1)
if header_i != []:
try:
slice_range = next((h.colindex, header_i[i + 1].colindex)
for i, h in enumerate(header_i) if header in h.header)
except Exception:
pass
return slice_range
def parse_headers(my_sheet, input_headers_dict, headers, start_line, slice_in):
"""return a dict of header_slice
key = column index
value = header name"""
for h0 in input_headers_dict:
slice_out = read_slice(my_sheet, start_line, slice_in, h0)
iteration = 1
while slice_out == (-1, -1) and iteration < 10:
# try next lines
slice_out = read_slice(my_sheet, start_line + iteration, slice_in, h0)
iteration += 1
if slice_out == (-1, -1):
if h0 in ('east', 'Node A', 'Node Z', 'City'):
print(f'{ansi_escapes.red}CRITICAL{ansi_escapes.reset}: missing _{h0}_ header: EXECUTION ENDS')
exit()
else:
print(f'missing header {h0}')
elif not isinstance(input_headers_dict[h0], dict):
headers[slice_out[0]] = input_headers_dict[h0]
else:
headers = parse_headers(my_sheet, input_headers_dict[h0], headers, start_line + 1, slice_out)
if headers == {}:
print(f'{ansi_escapes.red}CRITICAL ERROR{ansi_escapes.reset}: could not find any header to read _ ABORT')
exit()
return headers
def parse_row(row, headers):
return {f: r.value for f, r in
zip([label for label in headers.values()], [row[i] for i in headers])}
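# e.g. with headers == {0: 'from_city', 1: 'to_city'}, a row is flattened to
# {'from_city': row[0].value, 'to_city': row[1].value}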
def parse_sheet(my_sheet, input_headers_dict, header_line, start_line, column):
headers = parse_headers(my_sheet, input_headers_dict, {}, header_line, (0, column))
for row in all_rows(my_sheet, start=start_line):
yield parse_row(row[0: column], headers)
def sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city):
duplicate_links = []
for l1 in links:
for l2 in links:
if l1 is not l2 and l1 == l2 and l2 not in duplicate_links:
print(f'\nWARNING\n \
link {l1.from_city}-{l1.to_city} is duplicate \
\nthe 1st duplicate link will be removed but you should check Links sheet input')
duplicate_links.append(l1)
for l in duplicate_links:
links.remove(l)
try:
test_nodes = [n for n in nodes_by_city if n not in links_by_city]
test_links = [n for n in links_by_city if n not in nodes_by_city]
test_eqpts = [n for n in eqpts_by_city if n not in nodes_by_city]
assert (test_nodes == [] or test_nodes == [''])\
and (test_links == [] or test_links == [''])\
and (test_eqpts == [] or test_eqpts == [''])
except AssertionError:
msg = f'CRITICAL error in excel input: Names in Nodes and Links sheets do not match, check:\
\n{test_nodes} in Nodes sheet\
\n{test_links} in Links sheet\
\n{test_eqpts} in Eqpt sheet'
raise NetworkTopologyError(msg)
for city, link in links_by_city.items():
if nodes_by_city[city].node_type.lower() == 'ila' and len(link) != 2:
# wrong input: ILA sites can only be Degree 2
# => correct to make it a ROADM and remove entry in links_by_city
# TODO: put in log rather than print
print(f'invalid node type ({nodes_by_city[city].node_type})\
specified in {city}, replaced by ROADM')
nodes_by_city[city].node_type = 'ROADM'
for n in nodes:
if n.city == city:
n.node_type = 'ROADM'
return nodes, links
def xls_to_json_data(input_filename, filter_region=[]):
nodes, links, eqpts = parse_excel(input_filename)
if filter_region:
nodes = [n for n in nodes if n.region.lower() in filter_region]
cities = {n.city for n in nodes}
links = [lnk for lnk in links if lnk.from_city in cities and
lnk.to_city in cities]
cities = {lnk.from_city for lnk in links} | {lnk.to_city for lnk in links}
nodes = [n for n in nodes if n.city in cities]
global nodes_by_city
nodes_by_city = {n.city: n for n in nodes}
global links_by_city
links_by_city = defaultdict(list)
for link in links:
links_by_city[link.from_city].append(link)
links_by_city[link.to_city].append(link)
global eqpts_by_city
eqpts_by_city = defaultdict(list)
for eqpt in eqpts:
eqpts_by_city[eqpt.from_city].append(eqpt)
nodes, links = sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city)
return {
'elements':
[{'uid': f'trx {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Transceiver'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'] +
[{'uid': f'roadm {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'
and x.booster_restriction == '' and x.preamp_restriction == ''] +
[{'uid': f'roadm {x.city}',
'params': {
'restrictions': {
'preamp_variety_list': silent_remove(x.preamp_restriction.split(' | '), ''),
'booster_variety_list': silent_remove(x.booster_restriction.split(' | '), '')
}
},
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Roadm'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm' and
(x.booster_restriction != '' or x.preamp_restriction != '')] +
[{'uid': f'west fused spans in {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Fused'}
for x in nodes_by_city.values() if x.node_type.lower() == 'fused'] +
[{'uid': f'east fused spans in {x.city}',
'metadata': {'location': {'city': x.city,
'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude}},
'type': 'Fused'}
for x in nodes_by_city.values() if x.node_type.lower() == 'fused'] +
[{'uid': f'fiber ({x.from_city} \u2192 {x.to_city})-{x.east_cable}',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'type_variety': x.east_fiber,
'params': {'length': round(x.east_distance, 3),
'length_units': x.distance_units,
'loss_coef': x.east_lineic,
'con_in': x.east_con_in,
'con_out': x.east_con_out}
}
for x in links] +
[{'uid': f'fiber ({x.to_city} \u2192 {x.from_city})-{x.west_cable}',
'metadata': {'location': midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber',
'type_variety': x.west_fiber,
'params': {'length': round(x.west_distance, 3),
'length_units': x.distance_units,
'loss_coef': x.west_lineic,
'con_in': x.west_con_in,
'con_out': x.west_con_out}
} # missing ILA construction
for x in links] +
[{'uid': f'east edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Edfa',
'type_variety': e.east_amp_type,
'operational': {'gain_target': e.east_amp_gain,
'delta_p': e.east_amp_dp,
'tilt_target': e.east_tilt,
'out_voa': e.east_att_out}
}
for e in eqpts if (e.east_amp_type.lower() != '' and \
e.east_amp_type.lower() != 'fused')] +
[{'uid': f'west edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Edfa',
'type_variety': e.west_amp_type,
'operational': {'gain_target': e.west_amp_gain,
'delta_p': e.west_amp_dp,
'tilt_target': e.west_tilt,
'out_voa': e.west_att_out}
}
for e in eqpts if (e.west_amp_type.lower() != '' and \
e.west_amp_type.lower() != 'fused')] +
# fused edfa variety is a hack to indicate that there should not be
# a booster amplifier out of the roadm.
# If the user specifies ILA in the Nodes sheet and fused in the Eqpt sheet,
# then assume that this is a fused node.
[{'uid': f'east edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Fused',
'params': {'loss': 0}
}
for e in eqpts if e.east_amp_type.lower() == 'fused'] +
[{'uid': f'west edfa in {e.from_city} to {e.to_city}',
'metadata': {'location': {'city': nodes_by_city[e.from_city].city,
'region': nodes_by_city[e.from_city].region,
'latitude': nodes_by_city[e.from_city].latitude,
'longitude': nodes_by_city[e.from_city].longitude}},
'type': 'Fused',
'params': {'loss': 0}
}
for e in eqpts if e.west_amp_type.lower() == 'fused'],
'connections':
list(chain.from_iterable([eqpt_connection_by_city(n.city)
for n in nodes]))
+
list(chain.from_iterable(zip(
[{'from_node': f'trx {x.city}',
'to_node': f'roadm {x.city}'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'],
[{'from_node': f'roadm {x.city}',
'to_node': f'trx {x.city}'}
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'])))
}
def convert_file(input_filename, filter_region=[], output_json_file_name=None):
data = xls_to_json_data(input_filename, filter_region)
if output_json_file_name is None:
output_json_file_name = input_filename.with_suffix('.json')
with open(output_json_file_name, 'w', encoding='utf-8') as edfa_json_file:
edfa_json_file.write(dumps(data, indent=2, ensure_ascii=False))
return output_json_file_name
def corresp_names(input_filename, network):
""" a function that builds the correspondance between names given in the excel,
and names used in the json, and created by the autodesign.
All names are listed
"""
nodes, links, eqpts = parse_excel(input_filename)
fused = [n.uid for n in network.nodes() if isinstance(n, Fused)]
ila = [n.uid for n in network.nodes() if isinstance(n, Edfa)]
corresp_roadm = {x.city: [f'roadm {x.city}'] for x in nodes
if x.node_type.lower() == 'roadm'}
corresp_fused = {x.city: [f'west fused spans in {x.city}', f'east fused spans in {x.city}']
for x in nodes if x.node_type.lower() == 'fused' and
f'west fused spans in {x.city}' in fused and
f'east fused spans in {x.city}' in fused}
# add the special cases when an ila is changed into a fused
for my_e in eqpts:
name = f'east edfa in {my_e.from_city} to {my_e.to_city}'
if my_e.east_amp_type.lower() == 'fused' and name in fused:
if my_e.from_city in corresp_fused.keys():
corresp_fused[my_e.from_city].append(name)
else:
corresp_fused[my_e.from_city] = [name]
name = f'west edfa in {my_e.from_city} to {my_e.to_city}'
if my_e.west_amp_type.lower() == 'fused' and name in fused:
if my_e.from_city in corresp_fused.keys():
corresp_fused[my_e.from_city].append(name)
else:
corresp_fused[my_e.from_city] = [name]
# build corresp ila based on eqpt sheet
# start with east direction
corresp_ila = {e.from_city: [f'east edfa in {e.from_city} to {e.to_city}']
for e in eqpts if e.east_amp_type.lower() != '' and
f'east edfa in {e.from_city} to {e.to_city}' in ila}
# west direction, append name or create a new item in dict
for my_e in eqpts:
if my_e.west_amp_type.lower() != '':
name = f'west edfa in {my_e.from_city} to {my_e.to_city}'
if name in ila:
if my_e.from_city in corresp_ila.keys():
corresp_ila[my_e.from_city].append(name)
else:
corresp_ila[my_e.from_city] = [name]
# complete with potential autodesign names: amplifiers
for my_l in links:
name = f'Edfa0_fiber ({my_l.to_city} \u2192 {my_l.from_city})-{my_l.west_cable}'
if name in ila:
if my_l.from_city in corresp_ila.keys():
# "east edfa in Stbrieuc to Rennes_STA" is equivalent name as
# "Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056"
# "west edfa in Stbrieuc to Rennes_STA" is equivalent name as
# "Edfa0_fiber (Rennes_STA → Stbrieuc)-F057"
# does not filter names: all types (except boosters) are created.
# in case fibers are split the name here is a prefix
corresp_ila[my_l.from_city].append(name)
else:
corresp_ila[my_l.from_city] = [name]
name = f'Edfa0_fiber ({my_l.from_city} \u2192 {my_l.to_city})-{my_l.east_cable}'
if name in ila:
if my_l.to_city in corresp_ila.keys():
corresp_ila[my_l.to_city].append(name)
else:
corresp_ila[my_l.to_city] = [name]
# merge fused with ila:
for key, val in corresp_fused.items():
if key in corresp_ila.keys():
corresp_ila[key].extend(val)
else:
corresp_ila[key] = val
# no need of roadm booster
return corresp_roadm, corresp_fused, corresp_ila
def parse_excel(input_filename):
link_headers = {
'Node A': 'from_city',
'Node Z': 'to_city',
'east': {
'Distance (km)': 'east_distance',
'Fiber type': 'east_fiber',
'lineic att': 'east_lineic',
'Con_in': 'east_con_in',
'Con_out': 'east_con_out',
'PMD': 'east_pmd',
'Cable id': 'east_cable'
},
'west': {
'Distance (km)': 'west_distance',
'Fiber type': 'west_fiber',
'lineic att': 'west_lineic',
'Con_in': 'west_con_in',
'Con_out': 'west_con_out',
'PMD': 'west_pmd',
'Cable id': 'west_cable'
}
}
node_headers = {
'City': 'city',
'State': 'state',
'Country': 'country',
'Region': 'region',
'Latitude': 'latitude',
'Longitude': 'longitude',
'Type': 'node_type',
'Booster_restriction': 'booster_restriction',
'Preamp_restriction': 'preamp_restriction'
}
eqpt_headers = {
'Node A': 'from_city',
'Node Z': 'to_city',
'east': {
'amp type': 'east_amp_type',
'att_in': 'east_att_in',
'amp gain': 'east_amp_gain',
'delta p': 'east_amp_dp',
'tilt': 'east_tilt',
'att_out': 'east_att_out'
},
'west': {
'amp type': 'west_amp_type',
'att_in': 'west_att_in',
'amp gain': 'west_amp_gain',
'delta p': 'west_amp_dp',
'tilt': 'west_tilt',
'att_out': 'west_att_out'
}
}
with open_workbook(input_filename) as wb:
nodes_sheet = wb.sheet_by_name('Nodes')
links_sheet = wb.sheet_by_name('Links')
try:
eqpt_sheet = wb.sheet_by_name('Eqpt')
except Exception:
# eqpt_sheet is optional
eqpt_sheet = None
nodes = []
for node in parse_sheet(nodes_sheet, node_headers, NODES_LINE, NODES_LINE + 1, NODES_COLUMN):
nodes.append(Node(**node))
expected_node_types = {'ROADM', 'ILA', 'FUSED'}
for n in nodes:
if n.node_type not in expected_node_types:
n.node_type = 'ILA'
links = []
for link in parse_sheet(links_sheet, link_headers, LINKS_LINE, LINKS_LINE + 2, LINKS_COLUMN):
links.append(Link(**link))
eqpts = []
if eqpt_sheet is not None:
for eqpt in parse_sheet(eqpt_sheet, eqpt_headers, EQPTS_LINE, EQPTS_LINE + 2, EQPTS_COLUMN):
eqpts.append(Eqpt(**eqpt))
# sanity check
all_cities = Counter(n.city for n in nodes)
if len(all_cities) != len(nodes):
raise ValueError(f'Duplicate city: {all_cities}')
bad_links = []
for lnk in links:
if lnk.from_city not in all_cities or lnk.to_city not in all_cities:
bad_links.append([lnk.from_city, lnk.to_city])
if bad_links:
raise NetworkTopologyError(f'Bad link(s): {bad_links}.')
return nodes, links, eqpts
def eqpt_connection_by_city(city_name):
other_cities = fiber_dest_from_source(city_name)
subdata = []
if nodes_by_city[city_name].node_type.lower() in {'ila', 'fused'}:
# Then len(other_cities) == 2
direction = ['west', 'east']
for i in range(2):
from_ = fiber_link(other_cities[i], city_name)
in_ = eqpt_in_city_to_city(city_name, other_cities[0], direction[i])
to_ = fiber_link(city_name, other_cities[1 - i])
subdata += connect_eqpt(from_, in_, to_)
elif nodes_by_city[city_name].node_type.lower() == 'roadm':
for other_city in other_cities:
from_ = f'roadm {city_name}'
in_ = eqpt_in_city_to_city(city_name, other_city)
to_ = fiber_link(city_name, other_city)
subdata += connect_eqpt(from_, in_, to_)
from_ = fiber_link(other_city, city_name)
in_ = eqpt_in_city_to_city(city_name, other_city, "west")
to_ = f'roadm {city_name}'
subdata += connect_eqpt(from_, in_, to_)
return subdata
def connect_eqpt(from_, in_, to_):
connections = []
if in_ != '':
connections = [{'from_node': from_, 'to_node': in_},
{'from_node': in_, 'to_node': to_}]
else:
connections = [{'from_node': from_, 'to_node': to_}]
return connections
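# e.g. connect_eqpt('roadm A', 'east edfa in A to B', 'fiber (A → B)-cable')
# yields the two hops roadm -> edfa and edfa -> fiber; with in_ == '' the
# from_ element is wired straight onto to_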
def eqpt_in_city_to_city(in_city, to_city, direction='east'):
rev_direction = 'west' if direction == 'east' else 'east'
amp_direction = f'{direction}_amp_type'
amp_rev_direction = f'{rev_direction}_amp_type'
return_eqpt = ''
if in_city in eqpts_by_city:
for e in eqpts_by_city[in_city]:
if nodes_by_city[in_city].node_type.lower() == 'roadm':
if e.to_city == to_city and getattr(e, amp_direction) != '':
return_eqpt = f'{direction} edfa in {e.from_city} to {e.to_city}'
elif nodes_by_city[in_city].node_type.lower() == 'ila':
if e.to_city != to_city:
direction = rev_direction
amp_direction = amp_rev_direction
if getattr(e, amp_direction) != '':
return_eqpt = f'{direction} edfa in {e.from_city} to {e.to_city}'
if nodes_by_city[in_city].node_type.lower() == 'fused':
return_eqpt = f'{direction} fused spans in {in_city}'
return return_eqpt
def corresp_next_node(network, corresp_ila, corresp_roadm):
""" for each name in corresp dictionnaries find the next node in network and its name
given by user in excel. for meshTopology_exampleV2.xls:
user ILA name Stbrieuc covers the two direction. convert.py creates 2 different ILA
with possible names (depending on the direction and if the eqpt was defined in eqpt
sheet)
- east edfa in Stbrieuc to Rennes_STA
- west edfa in Stbrieuc to Rennes_STA
- Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056
- Edfa0_fiber (Rennes_STA → Stbrieuc)-F057
next_node finds the user-defined name of the next node, to be able to map the path constraints
- east edfa in Stbrieuc to Rennes_STA: next node = Rennes_STA
- west edfa in Stbrieuc to Rennes_STA: next node = Lannion_CAS
Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056 and Edfa0_fiber (Rennes_STA → Stbrieuc)-F057
do not exist
the function supports fiber splitting, fused nodes and shall only be called if
excel format is used for both network and service
"""
next_node = {}
# consolidate tables and create next_node table
for ila_key, ila_list in corresp_ila.items():
temp = copy(ila_list)
for ila_elem in ila_list:
# find the node with ila_elem string _in_ the node uid. 'in' is used instead of
# '==' to find composed nodes due to fiber splitting in autodesign.
# eg if ila_elem is 'Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056',
# node uid 'Edfa0_fiber (Lannion_CAS → Stbrieuc)-F056_(1/2)' is possible
correct_ila_name = next(n.uid for n in network.nodes() if ila_elem in n.uid)
temp.remove(ila_elem)
temp.append(correct_ila_name)
ila_nd = next(n for n in network.nodes() if ila_elem in n.uid)
next_nd = next(network.successors(ila_nd))
# search for the next ILA or ROADM
while isinstance(next_nd, (Fiber, Fused)):
next_nd = next(network.successors(next_nd))
# if next_nd is a ROADM, add the first found correspondence
for key, val in corresp_roadm.items():
# val is a list of possible names associated with key
if next_nd.uid in val:
next_node[correct_ila_name] = key
break
# if next_nd was not already added to the dict by the previous loop,
# add the first found correspondence among ila names
if correct_ila_name not in next_node.keys():
for key, val in corresp_ila.items():
# in case of split fibers the ila name might not be an exact match
if [e for e in val if e in next_nd.uid]:
next_node[correct_ila_name] = key
break
corresp_ila[ila_key] = temp
return corresp_ila, next_node
def fiber_dest_from_source(city_name):
destinations = []
links_from_city = links_by_city[city_name]
for l in links_from_city:
if l.from_city == city_name:
destinations.append(l.to_city)
else:
destinations.append(l.from_city)
return destinations
def fiber_link(from_city, to_city):
source_dest = (from_city, to_city)
links = links_by_city[from_city]
link = next(l for l in links if l.from_city in source_dest and l.to_city in source_dest)
if link.from_city == from_city:
fiber = f'fiber ({link.from_city} \u2192 {link.to_city})-{link.east_cable}'
else:
fiber = f'fiber ({link.to_city} \u2192 {link.from_city})-{link.west_cable}'
return fiber
def midpoint(city_a, city_b):
lats = city_a.latitude, city_b.latitude
longs = city_a.longitude, city_b.longitude
try:
result = {
'latitude': sum(lats) / 2,
'longitude': sum(longs) / 2
}
except TypeError:
result = {
'latitude': 0,
'longitude': 0
}
return result
# TODO get column size automatically from tuple size
NODES_COLUMN = 10
NODES_LINE = 4
LINKS_COLUMN = 16
LINKS_LINE = 3
EQPTS_LINE = 3
EQPTS_COLUMN = 14
def _do_convert():
parser = ArgumentParser()
parser.add_argument('workbook', type=Path)
parser.add_argument('-f', '--filter-region', action='append', default=[])
parser.add_argument('--output', type=Path, help='Name of the generated JSON file')
args = parser.parse_args()
res = convert_file(args.workbook, args.filter_region, args.output)
print(f'XLS -> JSON saved to {res}')
if __name__ == '__main__':
_do_convert()
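As a quick check of the converter above, the same conversion can also be driven from Python instead of the CLI; a minimal sketch, assuming a workbook laid out as described in the module docstring (the file name is an assumption):

from pathlib import Path

from gnpy.tools.convert import convert_file

# writes meshTopologyExampleV2.json next to the workbook and returns its path
print(convert_file(Path('meshTopologyExampleV2.xls')))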

gnpy/tools/json_io.py (new file)

@@ -0,0 +1,544 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.tools.json_io
==================
Loading and saving data from JSON files in GNPy's internal data format
'''
from networkx import DiGraph
from logging import getLogger
from pathlib import Path
import json
from collections import namedtuple
from gnpy.core import ansi_escapes, elements
from gnpy.core.equipment import trx_mode_params
from gnpy.core.exceptions import ConfigurationError, EquipmentConfigError, NetworkTopologyError, ServiceError
from gnpy.core.science_utils import estimate_nf_model
from gnpy.core.utils import automatic_nch, automatic_fmax, merge_amplifier_restrictions
from gnpy.topology.request import PathRequest, Disjunction
from gnpy.tools.convert import xls_to_json_data
from gnpy.tools.service_sheet import read_service_sheet
import time
_logger = getLogger(__name__)
Model_vg = namedtuple('Model_vg', 'nf1 nf2 delta_p')
Model_fg = namedtuple('Model_fg', 'nf0')
Model_openroadm = namedtuple('Model_openroadm', 'nf_coef')
Model_hybrid = namedtuple('Model_hybrid', 'nf_ram gain_ram edfa_variety')
Model_dual_stage = namedtuple('Model_dual_stage', 'preamp_variety booster_variety')
class _JsonThing:
def update_attr(self, default_values, kwargs, name):
clean_kwargs = {k: v for k, v in kwargs.items() if v != ''}
for k, v in default_values.items():
setattr(self, k, clean_kwargs.get(k, v))
if k not in clean_kwargs and name != 'Amp':
print(ansi_escapes.red +
f'\n WARNING missing {k} attribute in eqpt_config.json[{name}]' +
f'\n default value is {k} = {v}' +
ansi_escapes.reset)
time.sleep(1)
class SI(_JsonThing):
default_values = {
"f_min": 191.35e12,
"f_max": 196.1e12,
"baud_rate": 32e9,
"spacing": 50e9,
"power_dbm": 0,
"power_range_db": [0, 0, 0.5],
"roll_off": 0.15,
"tx_osnr": 45,
"sys_margins": 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'SI')
class Span(_JsonThing):
default_values = {
'power_mode': True,
'delta_power_range_db': None,
'max_fiber_lineic_loss_for_raman': 0.25,
'target_extended_gain': 2.5,
'max_length': 150,
'length_units': 'km',
'max_loss': None,
'padding': 10,
'EOL': 0,
'con_in': 0,
'con_out': 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Span')
class Roadm(_JsonThing):
default_values = {
'target_pch_out_db': -17,
'add_drop_osnr': 100,
'pmd': 0,
'restrictions': {
'preamp_variety_list': [],
'booster_variety_list': []
}
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Roadm')
class Transceiver(_JsonThing):
default_values = {
'type_variety': None,
'frequency': None,
'mode': {}
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Transceiver')
class Fiber(_JsonThing):
default_values = {
'type_variety': '',
'dispersion': None,
'gamma': 0,
'pmd_coef': 0
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Fiber')
class RamanFiber(_JsonThing):
default_values = {
'type_variety': '',
'dispersion': None,
'gamma': 0,
'pmd_coef': 0,
'raman_efficiency': None
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'RamanFiber')
for param in ('cr', 'frequency_offset'):
if param not in self.raman_efficiency:
raise EquipmentConfigError(f'RamanFiber.raman_efficiency: missing "{param}" parameter')
if self.raman_efficiency['frequency_offset'] != sorted(self.raman_efficiency['frequency_offset']):
raise EquipmentConfigError(f'RamanFiber.raman_efficiency.frequency_offset is not sorted')
class Amp(_JsonThing):
default_values = {
'f_min': 191.35e12,
'f_max': 196.1e12,
'type_variety': '',
'type_def': '',
'gain_flatmax': None,
'gain_min': None,
'p_max': None,
'nf_model': None,
'dual_stage_model': None,
'nf_fit_coeff': None,
'nf_ripple': None,
'dgt': None,
'gain_ripple': None,
'out_voa_auto': False,
'allowed_for_design': False,
'raman': False
}
def __init__(self, **kwargs):
self.update_attr(self.default_values, kwargs, 'Amp')
@classmethod
def from_json(cls, filename, **kwargs):
config = Path(filename).parent / 'default_edfa_config.json'
type_variety = kwargs['type_variety']
type_def = kwargs.get('type_def', 'variable_gain') # default compatibility with older json eqpt files
nf_def = None
dual_stage_def = None
if type_def == 'fixed_gain':
try:
nf0 = kwargs.pop('nf0')
except KeyError: # nf0 is expected for a fixed gain amp
raise EquipmentConfigError(f'missing nf0 value input for amplifier: {type_variety} in equipment config')
for k in ('nf_min', 'nf_max'):
try:
del kwargs[k]
except KeyError:
pass
nf_def = Model_fg(nf0)
elif type_def == 'advanced_model':
config = Path(filename).parent / kwargs.pop('advanced_config_from_json')
elif type_def == 'variable_gain':
gain_min, gain_max = kwargs['gain_min'], kwargs['gain_flatmax']
try: # nf_min and nf_max are expected for a variable gain amp
nf_min = kwargs.pop('nf_min')
nf_max = kwargs.pop('nf_max')
except KeyError:
raise EquipmentConfigError(f'missing nf_min or nf_max value input for amplifier: {type_variety} in equipment config')
try: # remove all remaining nf inputs
del kwargs['nf0']
except KeyError:
pass # nf0 is not needed for variable gain amp
nf1, nf2, delta_p = estimate_nf_model(type_variety, gain_min, gain_max, nf_min, nf_max)
nf_def = Model_vg(nf1, nf2, delta_p)
elif type_def == 'openroadm':
try:
nf_coef = kwargs.pop('nf_coef')
except KeyError: # nf_coef is expected for openroadm amp
raise EquipmentConfigError(f'missing nf_coef input for amplifier: {type_variety} in equipment config')
nf_def = Model_openroadm(nf_coef)
elif type_def == 'dual_stage':
try: # preamp_variety and booster_variety are expected for a dual stage amp
preamp_variety = kwargs.pop('preamp_variety')
booster_variety = kwargs.pop('booster_variety')
except KeyError:
raise EquipmentConfigError(f'missing preamp/booster variety input for amplifier: {type_variety} in equipment config')
dual_stage_def = Model_dual_stage(preamp_variety, booster_variety)
json_data = load_json(config)
return cls(**{**kwargs, **json_data,
'nf_model': nf_def, 'dual_stage_model': dual_stage_def})
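# a minimal 'fixed_gain' entry in eqpt_config.json would look like this sketch
# (the type_variety and numbers are illustrative assumptions, not shipped defaults):
#     {"type_variety": "std_fixed_gain", "type_def": "fixed_gain",
#      "gain_flatmax": 21, "gain_min": 20, "p_max": 21, "nf0": 5.5,
#      "allowed_for_design": true}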
def _automatic_spacing(baud_rate):
"""return the min possible channel spacing for a given baud rate"""
# TODO: this should be parametrized in a cfg file
# list of possible tuples [(max_baud_rate, spacing_for_this_baud_rate)]
spacing_list = [(33e9, 37.5e9), (38e9, 50e9), (50e9, 62.5e9), (67e9, 75e9), (92e9, 100e9)]
return min((s[1] for s in spacing_list if s[0] > baud_rate), default=baud_rate * 1.2)
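# e.g. a 32e9 baud rate selects 37.5e9 (the narrowest grid wider than the baud
# rate), 64e9 selects 75e9, and anything above 92e9 falls back to baud_rate * 1.2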
def load_equipment(filename):
json_data = load_json(filename)
return _equipment_from_json(json_data, filename)
def _update_trx_osnr(equipment):
"""add sys_margins to all Transceivers OSNR values"""
for trx in equipment['Transceiver'].values():
for m in trx.mode:
m['OSNR'] = m['OSNR'] + equipment['SI']['default'].sys_margins
return equipment
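# e.g. with SI sys_margins = 2, a mode advertising OSNR = 15 is stored as 17,
# tightening every feasibility check by the configured margin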
def _update_dual_stage(equipment):
edfa_dict = equipment['Edfa']
for edfa in edfa_dict.values():
if edfa.type_def == 'dual_stage':
edfa_preamp = edfa_dict[edfa.dual_stage_model.preamp_variety]
edfa_booster = edfa_dict[edfa.dual_stage_model.booster_variety]
for key, value in edfa_preamp.__dict__.items():
attr_k = 'preamp_' + key
setattr(edfa, attr_k, value)
for key, value in edfa_booster.__dict__.items():
attr_k = 'booster_' + key
setattr(edfa, attr_k, value)
edfa.p_max = edfa_booster.p_max
edfa.gain_flatmax = edfa_booster.gain_flatmax + edfa_preamp.gain_flatmax
if edfa.gain_min < edfa_preamp.gain_min:
raise EquipmentConfigError(f'Dual stage {edfa.type_variety} minimal gain is lower than its preamp minimal gain')
return equipment
def _roadm_restrictions_sanity_check(equipment):
""" verifies that booster and preamp restrictions specified in roadm equipment are listed
in the edfa.
"""
restrictions = equipment['Roadm']['default'].restrictions['booster_variety_list'] + \
equipment['Roadm']['default'].restrictions['preamp_variety_list']
for amp_name in restrictions:
if amp_name not in equipment['Edfa']:
raise EquipmentConfigError(f'ROADM restriction {amp_name} does not refer to a defined EDFA name')
def _equipment_from_json(json_data, filename):
"""build global dictionnary eqpt_library that stores all eqpt characteristics:
edfa type type_variety, fiber type_variety
from the eqpt_config.json (filename parameter)
also read advanced_config_from_json file parameters for edfa if they are available:
typically nf_ripple, dfg gain ripple, dgt and nf polynomial nf_fit_coeff
if advanced_config_from_json file parameter is not present: use nf_model:
requires nf_min and nf_max values boundaries of the edfa gain range
"""
equipment = {}
for key, entries in json_data.items():
equipment[key] = {}
for entry in entries:
subkey = entry.get('type_variety', 'default')
if key == 'Edfa':
equipment[key][subkey] = Amp.from_json(filename, **entry)
elif key == 'Fiber':
equipment[key][subkey] = Fiber(**entry)
elif key == 'Span':
equipment[key][subkey] = Span(**entry)
elif key == 'Roadm':
equipment[key][subkey] = Roadm(**entry)
elif key == 'SI':
equipment[key][subkey] = SI(**entry)
elif key == 'Transceiver':
equipment[key][subkey] = Transceiver(**entry)
elif key == 'RamanFiber':
equipment[key][subkey] = RamanFiber(**entry)
else:
raise EquipmentConfigError(f'Unrecognized network element type "{key}"')
equipment = _update_trx_osnr(equipment)
equipment = _update_dual_stage(equipment)
_roadm_restrictions_sanity_check(equipment)
return equipment
def load_network(filename, equipment):
if filename.suffix.lower() in ('.xls', '.xlsx'):
json_data = xls_to_json_data(filename)
elif filename.suffix.lower() == '.json':
json_data = load_json(filename)
else:
raise ValueError(f'unsupported topology filename extension {filename.suffix.lower()}')
return network_from_json(json_data, equipment)
def save_network(network: DiGraph, filename: str):
'''Dump the network into a JSON file
:param network: network to work on
:param filename: file to write to
'''
save_json(network_to_json(network), filename)
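# a hypothetical round trip, with the file names being assumptions:
#     equipment = load_equipment(Path('eqpt_config.json'))
#     network = load_network(Path('topology.json'), equipment)
#     save_network(network, 'topology_roundtrip.json')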
def _cls_for(equipment_type):
if equipment_type == 'Edfa':
return elements.Edfa
if equipment_type == 'Fused':
return elements.Fused
elif equipment_type == 'Roadm':
return elements.Roadm
elif equipment_type == 'Transceiver':
return elements.Transceiver
elif equipment_type == 'Fiber':
return elements.Fiber
elif equipment_type == 'RamanFiber':
return elements.RamanFiber
else:
raise ConfigurationError(f'Unknown network equipment "{equipment_type}"')


def network_from_json(json_data, equipment):
    # NOTE|dutc: we could use the following, but it would tie our data format
    # too closely to the graph library
    # from networkx import node_link_graph
    g = DiGraph()
    for el_config in json_data['elements']:
        typ = el_config.pop('type')
        variety = el_config.pop('type_variety', 'default')
        cls = _cls_for(typ)
        if typ == 'Fused':
            # well, there's no variety for the 'Fused' node type
            pass
        elif variety in equipment[typ]:
            extra_params = equipment[typ][variety]
            temp = el_config.setdefault('params', {})
            temp = merge_amplifier_restrictions(temp, extra_params.__dict__)
            el_config['params'] = temp
            el_config['type_variety'] = variety
        elif typ in ['Edfa', 'Fiber', 'RamanFiber']:  # catch it now because the code will crash later!
            raise ConfigurationError(f'The {typ} of variety type {variety} was not recognized:'
                                     '\nplease check it is properly defined in the eqpt_config json file')
        el = cls(**el_config)
        g.add_node(el)
    nodes = {k.uid: k for k in g.nodes()}
    for cx in json_data['connections']:
        from_node, to_node = cx['from_node'], cx['to_node']
        try:
            if isinstance(nodes[from_node], elements.Fiber):
                edge_length = nodes[from_node].params.length
            else:
                edge_length = 0.01
            g.add_edge(nodes[from_node], nodes[to_node], weight=edge_length)
        except KeyError:
            raise NetworkTopologyError(f'can not find {from_node} or {to_node} defined in {cx}')
    return g
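
# Illustrative minimal input (editor's sketch): two elements and one connection,
# following the schema consumed above; uids are hypothetical:
#     json_data = {
#         'elements': [
#             {'uid': 'trx A', 'type': 'Transceiver'},
#             {'uid': 'trx B', 'type': 'Transceiver'},
#         ],
#         'connections': [
#             {'from_node': 'trx A', 'to_node': 'trx B'},
#         ],
#     }
#     g = network_from_json(json_data, equipment)   # DiGraph with 2 nodes, 1 edge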


def network_to_json(network):
    data = {
        'elements': [n.to_json for n in network]
    }
    connections = {
        'connections': [{"from_node": n.uid,
                         "to_node": next_n.uid}
                        for n in network
                        for next_n in network.successors(n) if next_n is not None]
    }
    data.update(connections)
    return data


def load_json(filename):
    with open(filename, 'r', encoding='utf-8') as f:
        data = json.load(f)
    return data


def save_json(obj, filename):
    with open(filename, 'w', encoding='utf-8') as f:
        json.dump(obj, f, indent=2, ensure_ascii=False)


def load_requests(filename, eqpt, bidir, network, network_filename):
    """Load the requests from a JSON or an Excel file into a JSON data structure."""
    if filename.suffix.lower() in ('.xls', '.xlsx'):
        _logger.info('Automatically converting requests from XLS to JSON')
        try:
            return convert_service_sheet(filename, eqpt, network, network_filename=network_filename, bidir=bidir)
        except ServiceError as this_e:
            print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {this_e}')
            exit(1)
    else:
        return load_json(filename)


def requests_from_json(json_data, equipment):
    """Extract the list of requests from data parsed from JSON"""
    requests_list = []
    for req in json_data['path-request']:
        # init all params from request
        params = {}
        params['request_id'] = req['request-id']
        params['source'] = req['source']
        params['bidir'] = req['bidirectional']
        params['destination'] = req['destination']
        params['trx_type'] = req['path-constraints']['te-bandwidth']['trx_type']
        if 'trx_mode' in req['path-constraints']['te-bandwidth'].keys():
            params['trx_mode'] = req['path-constraints']['te-bandwidth']['trx_mode']
        else:
            params['trx_mode'] = None
        params['format'] = params['trx_mode']
        params['spacing'] = req['path-constraints']['te-bandwidth']['spacing']
        try:
            nd_list = req['explicit-route-objects']['route-object-include-exclude']
        except KeyError:
            nd_list = []
        params['nodes_list'] = [n['num-unnum-hop']['node-id'] for n in nd_list]
        params['loose_list'] = [n['num-unnum-hop']['hop-type'] for n in nd_list]
        # recover trx physical params (baudrate, ...) from type and mode;
        # in trx_mode_params, optical power is read from equipment['SI']['default'] and
        # nb_channel is computed based on min/max frequency and spacing
        trx_params = trx_mode_params(equipment, params['trx_type'], params['trx_mode'], True)
        params.update(trx_params)
        # optical power might be set differently in the request; if it is indicated then
        # params['power'] is updated
        try:
            if req['path-constraints']['te-bandwidth']['output-power']:
                params['power'] = req['path-constraints']['te-bandwidth']['output-power']
        except KeyError:
            pass
        # same process for nb-channel
        f_min = params['f_min']
        f_max_from_si = params['f_max']
        try:
            if req['path-constraints']['te-bandwidth']['max-nb-of-channel'] is not None:
                nch = req['path-constraints']['te-bandwidth']['max-nb-of-channel']
                params['nb_channel'] = nch
                spacing = params['spacing']
                params['f_max'] = automatic_fmax(f_min, spacing, nch)
            else:
                params['nb_channel'] = automatic_nch(f_min, f_max_from_si, params['spacing'])
        except KeyError:
            params['nb_channel'] = automatic_nch(f_min, f_max_from_si, params['spacing'])
        if 'effective-freq-slot' in req['path-constraints']['te-bandwidth']:
            # temporarily reads only the first slot
            params['effective_freq_slot'] = req['path-constraints']['te-bandwidth']['effective-freq-slot'][0]
        else:
            params['effective_freq_slot'] = None
        _check_one_request(params, f_max_from_si)
        try:
            params['path_bandwidth'] = req['path-constraints']['te-bandwidth']['path_bandwidth']
        except KeyError:
            pass
        requests_list.append(PathRequest(**params))
    return requests_list
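
# Illustrative minimal request (editor's sketch, values hypothetical): only the
# fields read above are provided; absent optional fields fall back to the SI
# defaults through trx_mode_params / automatic_nch:
#     json_data = {'path-request': [{
#         'request-id': '0',
#         'source': 'trx A', 'destination': 'trx B',
#         'bidirectional': False,
#         'path-constraints': {'te-bandwidth': {
#             'trx_type': 'Voyager', 'trx_mode': 'mode 1',
#             'spacing': 50e9, 'max-nb-of-channel': None,
#             'path_bandwidth': 100e9}}}]}
#     [request] = requests_from_json(json_data, equipment)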


def _check_one_request(params, f_max_from_si):
    """Check that the requested parameters are consistent (spacing vs nb channel vs transponder mode...)"""
    f_min = params['f_min']
    f_max = params['f_max']
    max_recommended_nb_channels = automatic_nch(f_min, f_max, params['spacing'])
    if params['baud_rate'] is not None:
        # implicitly means that a mode is defined with min_spacing
        if params['min_spacing'] > params['spacing']:
            msg = f'Request {params["request_id"]} has spacing below transponder ' +\
                  f'{params["trx_type"]} {params["trx_mode"]} min spacing value ' +\
                  f'{params["min_spacing"] * 1e-9}GHz.\nComputation stopped'
            print(msg)
            _logger.critical(msg)
            raise ServiceError(msg)
        if f_max > f_max_from_si:
            msg = f'Requested channel number {params["nb_channel"]}, baud rate {params["baud_rate"] * 1e-9} GHz ' \
                  f'and requested spacing {params["spacing"] * 1e-9}GHz are not consistent with the frequency range ' \
                  f'{f_min * 1e-12} THz, {f_max * 1e-12} THz, min recommended spacing {params["min_spacing"] * 1e-9}GHz; ' \
                  f'max recommended nb of channels is {max_recommended_nb_channels}.'
            _logger.critical(msg)
            raise ServiceError(msg)
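
# Worked example (editor's sketch) of the consistency check above, assuming
# automatic_nch counts the channels fitting between f_min and f_max and
# automatic_fmax returns f_min + spacing * nch:
#     f_min, f_max_si, spacing = 191.35e12, 196.1e12, 50e9
#     automatic_nch(f_min, f_max_si, spacing)   # -> 95 channels fit
#     automatic_fmax(f_min, spacing, 96)        # -> 196.15e12 > f_max_si, so a
#     # request for 96 channels at this spacing raises ServiceError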


def disjunctions_from_json(json_data):
    """Read the disjunction requests from the JSON dict and create the list
    of requested disjunctions for this set of requests.
    """
    disjunctions_list = []
    for snc in json_data.get('synchronization', []):
        params = {}
        params['disjunction_id'] = snc['synchronization-id']
        params['relaxable'] = snc['svec']['relaxable']
        params['link_diverse'] = 'link' in snc['svec']['disjointness']
        params['node_diverse'] = 'node' in snc['svec']['disjointness']
        params['disjunctions_req'] = snc['svec']['request-id-number']
        disjunctions_list.append(Disjunction(**params))
    return disjunctions_list
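
# Illustrative input (editor's sketch, ids hypothetical): one synchronization
# vector requesting node and link disjointness between requests '3' and '1':
#     json_data = {'synchronization': [{
#         'synchronization-id': '3',
#         'svec': {'relaxable': 'false', 'disjointness': 'node link',
#                  'request-id-number': ['3', '1']}}]}
#     [dis] = disjunctions_from_json(json_data)   # dis.node_diverse is True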


def convert_service_sheet(
        input_filename,
        eqpt,
        network,
        network_filename=None,
        output_filename='',
        bidir=False,
        filter_region=None):
    if output_filename == '':
        output_filename = f'{str(input_filename)[0:len(str(input_filename)) - len(str(input_filename.suffixes[0]))]}_services.json'
    data = read_service_sheet(input_filename, eqpt, network, network_filename, bidir, filter_region)
    save_json(data, output_filename)
    return data
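
# Example of the default output naming (editor's note): for an input file
# 'mesh_example.xls', suffixes[0] is '.xls', so output_filename becomes
# 'mesh_example_services.json' next to the input file.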

gnpy/tools/plots.py Executable file

@@ -0,0 +1,82 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
'''
gnpy.tools.plots
================

Graphs and plots usable from a CLI application
'''

from matplotlib.pyplot import show, axis, figure, title, text
from networkx import draw_networkx_nodes, draw_networkx_edges, draw_networkx_labels
from gnpy.core.elements import Transceiver


def plot_baseline(network):
    edges = set(network.edges())
    pos = {n: (n.lng, n.lat) for n in network.nodes()}
    labels = {n: n.location.city for n in network.nodes() if isinstance(n, Transceiver)}
    city_labels = set(labels.values())
    for n in network.nodes():
        if n.location.city and n.location.city not in city_labels:
            labels[n] = n.location.city
            city_labels.add(n.location.city)
    label_pos = pos

    fig = figure()
    kwargs = {'figure': fig, 'pos': pos}
    draw_networkx_nodes(network, nodelist=network.nodes(), node_color='#ababab', **kwargs)
    draw_networkx_edges(network, edgelist=edges, edge_color='#ababab', **kwargs)
    draw_networkx_labels(network, labels=labels, font_size=14, **{**kwargs, 'pos': label_pos})
    axis('off')
    show()


def plot_results(network, path, source, destination, infos):
    path_edges = set(zip(path[:-1], path[1:]))
    edges = set(network.edges()) - path_edges
    pos = {n: (n.lng, n.lat) for n in network.nodes()}
    nodes = {}
    for k, (x, y) in pos.items():
        nodes.setdefault((round(x, 1), round(y, 1)), []).append(k)
    labels = {n: n.location.city for n in network.nodes() if isinstance(n, Transceiver)}
    city_labels = set(labels.values())
    for n in network.nodes():
        if n.location.city and n.location.city not in city_labels:
            labels[n] = n.location.city
            city_labels.add(n.location.city)
    label_pos = pos

    fig = figure()
    kwargs = {'figure': fig, 'pos': pos}
    all_nodes = [n for n in network.nodes() if n not in path]
    draw_networkx_nodes(network, nodelist=all_nodes, node_color='#ababab', node_size=50, **kwargs)
    draw_networkx_nodes(network, nodelist=path, node_color='#ff0000', node_size=55, **kwargs)
    draw_networkx_edges(network, edgelist=edges, edge_color='#ababab', **kwargs)
    draw_networkx_edges(network, edgelist=path_edges, edge_color='#ff0000', **kwargs)
    draw_networkx_labels(network, labels=labels, font_size=14, **{**kwargs, 'pos': label_pos})
    title(f'Propagating from {source.loc.city} to {destination.loc.city}')
    axis('off')

    heading = 'Spectral Information\n\n'
    textbox = text(0.85, 0.20, heading, fontsize=14, fontname='Ubuntu Mono',
                   verticalalignment='top', transform=fig.axes[0].transAxes,
                   bbox={'boxstyle': 'round', 'facecolor': 'wheat', 'alpha': 0.5})

    msgs = {(x, y): heading + '\n\n'.join(str(n) for n in ns if n in path)
            for (x, y), ns in nodes.items()}

    def hover(event):
        if event.xdata is None or event.ydata is None:
            return
        # Figure.contains() returns an (inside, details) tuple; only the flag matters here
        inside, _ = fig.contains(event)
        if inside:
            x, y = round(event.xdata, 1), round(event.ydata, 1)
            if (x, y) in msgs:
                textbox.set_text(msgs[x, y])
            else:
                textbox.set_text(heading)
            fig.canvas.draw_idle()

    fig.canvas.mpl_connect('motion_notify_event', hover)
    show()
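
# Illustrative usage (editor's sketch): `network`, `path` and `infos` come from
# the rest of the gnpy toolchain; source and destination are the path endpoints:
#     plot_baseline(network)                                  # topology only
#     plot_results(network, path, path[0], path[-1], infos)   # highlighted path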

gnpy/tools/service_sheet.py Normal file

@@ -0,0 +1,381 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
gnpy.tools.service_sheet
========================

XLS parser that can be called to create a JSON request file in accordance with
the Yang model for requesting path computation.

See: draft-ietf-teas-yang-path-computation-01.txt
"""

from xlrd import open_workbook, XL_CELL_EMPTY
from collections import namedtuple
from logging import getLogger
from copy import deepcopy
from gnpy.core.utils import db2lin
from gnpy.core.exceptions import ServiceError
from gnpy.core.elements import Transceiver, Roadm, Edfa, Fiber
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.tools.convert import corresp_names, corresp_next_node

SERVICES_COLUMN = 12


def all_rows(sheet, start=0):
    return (sheet.row(x) for x in range(start, sheet.nrows))


logger = getLogger(__name__)


class Request(namedtuple('Request', 'request_id source destination trx_type mode \
 spacing power nb_channel disjoint_from nodes_list is_loose path_bandwidth')):
    def __new__(cls, request_id, source, destination, trx_type, mode=None, spacing=None, power=None,
                nb_channel=None, disjoint_from='', nodes_list=None, is_loose='', path_bandwidth=None):
        return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing, power,
                               nb_channel, disjoint_from, nodes_list, is_loose, path_bandwidth)
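
# Illustrative instantiation (editor's sketch, values hypothetical): only the
# first four fields are mandatory, the others take the defaults defined above;
# spacing and power are still in excel units (GHz, dBm) at this stage:
#     r = Request('1', 'Lannion', 'Lorient', 'Voyager', mode='mode 1', spacing=50.0)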


class Element:
    def __eq__(self, other):
        return type(self) == type(other) and self.uid == other.uid

    def __hash__(self):
        return hash((type(self), self.uid))


class Request_element(Element):
    def __init__(self, Request, equipment, bidir):
        # request_id is str
        # excel has automatic number formatting that adds .0 on integer values;
        # the next lines recover the pure int value, assuming this .0 is unwanted
        self.request_id = correct_xlrd_int_to_str_reading(Request.request_id)
        self.source = f'trx {Request.source}'
        self.destination = f'trx {Request.destination}'
        # TODO: the automatic naming generated by the excel parser requires that source and dest names
        # be strings starting with 'trx'; this is manually added here.
        self.srctpid = f'trx {Request.source}'
        self.dsttpid = f'trx {Request.destination}'
        self.bidir = bidir
        # test that trx_type belongs to eqpt_config.json;
        # if not, replace it with a default
        try:
            if equipment['Transceiver'][Request.trx_type]:
                self.trx_type = correct_xlrd_int_to_str_reading(Request.trx_type)
            if Request.mode is not None:
                Requestmode = correct_xlrd_int_to_str_reading(Request.mode)
                if [mode for mode in equipment['Transceiver'][Request.trx_type].mode if mode['format'] == Requestmode]:
                    self.mode = Requestmode
                else:
                    msg = f'Request Id: {self.request_id} - could not find tsp: \'{Request.trx_type}\' ' \
                          f'with mode: \'{Requestmode}\' in eqpt library.\nComputation stopped.'
                    logger.critical(msg)
                    raise ServiceError(msg)
            else:
                Requestmode = None
                self.mode = Request.mode
        except KeyError:
            msg = f'Request Id: {self.request_id} - could not find tsp: \'{Request.trx_type}\' ' \
                  f'with mode: \'{Request.mode}\' in eqpt library.\nComputation stopped.'
            logger.critical(msg)
            raise ServiceError(msg)
        # excel inputs are in GHz and dBm
        if Request.spacing is not None:
            self.spacing = Request.spacing * 1e9
        else:
            msg = f'Request {self.request_id} missing spacing: spacing is mandatory.\nComputation stopped'
            logger.critical(msg)
            raise ServiceError(msg)
        if Request.power is not None:
            self.power = db2lin(Request.power) * 1e-3
        else:
            self.power = None
        if Request.nb_channel is not None:
            self.nb_channel = int(Request.nb_channel)
        else:
            self.nb_channel = None
        value = correct_xlrd_int_to_str_reading(Request.disjoint_from)
        self.disjoint_from = [n for n in value.split(' | ') if value]
        self.nodes_list = []
        if Request.nodes_list:
            self.nodes_list = Request.nodes_list.split(' | ')
        self.loose = 'LOOSE'
        if Request.is_loose.lower() == 'no':
            self.loose = 'STRICT'
        if Request.path_bandwidth is not None:
            self.path_bandwidth = Request.path_bandwidth * 1e9
        else:
            self.path_bandwidth = 0

    uid = property(lambda self: repr(self))

    @property
    def pathrequest(self):
        # Default assumption for bidir is False
        req_dictionary = {
            'request-id': self.request_id,
            'source': self.source,
            'destination': self.destination,
            'src-tp-id': self.srctpid,
            'dst-tp-id': self.dsttpid,
            'bidirectional': self.bidir,
            'path-constraints': {
                'te-bandwidth': {
                    'technology': 'flexi-grid',
                    'trx_type': self.trx_type,
                    'trx_mode': self.mode,
                    'effective-freq-slot': [{'N': None, 'M': None}],
                    'spacing': self.spacing,
                    'max-nb-of-channel': self.nb_channel,
                    'output-power': self.power
                }
            }
        }
        if self.nodes_list:
            req_dictionary['explicit-route-objects'] = {
                'route-object-include-exclude': [
                    {'explicit-route-usage': 'route-include-ero',
                     'index': self.nodes_list.index(node),
                     'num-unnum-hop': {
                         'node-id': f'{node}',
                         'link-tp-id': 'link-tp-id is not used',
                         'hop-type': f'{self.loose}',
                     }
                     }
                    for node in self.nodes_list]
            }
        if self.path_bandwidth is not None:
            req_dictionary['path-constraints']['te-bandwidth']['path_bandwidth'] = self.path_bandwidth
        return req_dictionary

    @property
    def pathsync(self):
        if self.disjoint_from:
            return {'synchronization-id': self.request_id,
                    'svec': {
                        'relaxable': 'false',
                        'disjointness': 'node link',
                        'request-id-number': [self.request_id] + [n for n in self.disjoint_from]
                    }
                    }
        else:
            return None

    # TO-DO: avoid multiple entries with the same synchronization vectors

    @property
    def json(self):
        return self.pathrequest, self.pathsync


def read_service_sheet(
        input_filename,
        eqpt,
        network,
        network_filename=None,
        bidir=False,
        filter_region=None):
    """Convert a service sheet into a JSON structure."""
    if filter_region is None:
        filter_region = []
    if network_filename is None:
        network_filename = input_filename
    service = parse_excel(input_filename)
    req = [Request_element(n, eqpt, bidir) for n in service]
    req = correct_xls_route_list(network_filename, network, req)
    # if there is no sync vector, do not write any synchronization
    synchro = [n.json[1] for n in req if n.json[1] is not None]
    if synchro:
        data = {
            'path-request': [n.json[0] for n in req],
            'synchronization': synchro
        }
    else:
        data = {
            'path-request': [n.json[0] for n in req]
        }
    return data


def correct_xlrd_int_to_str_reading(v):
    if not isinstance(v, str):
        value = str(int(v))
        if value.endswith('.0'):
            value = value[:-2]
    else:
        value = v
    return value
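
# Examples (editor's note): xlrd reads numeric cells as floats, so integer ids
# arrive as e.g. 5.0 and are normalized back to their string form:
#     correct_xlrd_int_to_str_reading(5.0)      # -> '5'
#     correct_xlrd_int_to_str_reading('trx a')  # -> 'trx a' (strings pass through)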


def parse_row(row, fieldnames):
    return {f: r.value for f, r in zip(fieldnames, row[0:SERVICES_COLUMN])
            if r.ctype != XL_CELL_EMPTY}


def parse_excel(input_filename):
    with open_workbook(input_filename) as wb:
        service_sheet = wb.sheet_by_name('Service')
        services = list(parse_service_sheet(service_sheet))
    return services


def parse_service_sheet(service_sheet):
    """Read each column according to the authorized fieldnames; column order is not important."""
    logger.info(f'Validating headers on {service_sheet.name!r}')
    # add a test on the field to handle the '' field case that arises when columns on the
    # right hand side are used as comments or drawings in the excel sheet
    header = [x.value.strip() for x in service_sheet.row(4)[0:SERVICES_COLUMN]
              if len(x.value.strip()) > 0]
    # create a service_fieldnames list independent of the excel column order,
    # to be compatible with any version of the sheet;
    # the following dictionary records the excel field names and the corresponding parameter names
    authorized_fieldnames = {
        'route id': 'request_id', 'Source': 'source', 'Destination': 'destination',
        'TRX type': 'trx_type', 'Mode': 'mode', 'System: spacing': 'spacing',
        'System: input power (dBm)': 'power', 'System: nb of channels': 'nb_channel',
        'routing: disjoint from': 'disjoint_from', 'routing: path': 'nodes_list',
        'routing: is loose?': 'is_loose', 'path bandwidth': 'path_bandwidth'}
    try:
        service_fieldnames = [authorized_fieldnames[e] for e in header]
    except KeyError:
        msg = f'Malformed header on Service sheet: {header} field not in {authorized_fieldnames}'
        logger.critical(msg)
        raise ValueError(msg)
    for row in all_rows(service_sheet, start=5):
        yield Request(**parse_row(row[0:SERVICES_COLUMN], service_fieldnames))


def correct_xls_route_list(network_filename, network, pathreqlist):
    """Prepare the route list of nodes to be consistent with node names:
    remove wrong names, find correct names for ILA, ROADM and fused nodes if the entry
    came from xls.
    If it did not come from xls, all names in the list should be exact names in the network.
    """
    # first load the base correspondence dict built with excel naming
    corresp_roadm, corresp_fused, corresp_ila = corresp_names(network_filename, network)
    # then correct dict names with names of the autodesign and find the next_node name
    # according to xls naming
    corresp_ila, next_node = corresp_next_node(network, corresp_ila, corresp_roadm)
    # finally correct constraints based on these dicts
    trxfibertype = [n.uid for n in network.nodes() if isinstance(n, (Transceiver, Fiber))]
    roadmtype = [n.uid for n in network.nodes() if isinstance(n, Roadm)]
    edfatype = [n.uid for n in network.nodes() if isinstance(n, Edfa)]
    # TODO: there is a problem identifying fibers in case of parallel
    # fibers between two adjacent roadms, so the fiber constraint is not supported
    transponders = [n.uid for n in network.nodes() if isinstance(n, Transceiver)]
    for pathreq in pathreqlist:
        # first check that source and dest are transceivers
        if pathreq.source not in transponders:
            msg = f'{ansi_escapes.red}Request: {pathreq.request_id}: could not find' +\
                  f' transponder source: {pathreq.source}.{ansi_escapes.reset}'
            logger.critical(msg)
            raise ServiceError(msg)
        if pathreq.destination not in transponders:
            msg = f'{ansi_escapes.red}Request: {pathreq.request_id}: could not find' +\
                  f' transponder destination: {pathreq.destination}.{ansi_escapes.reset}'
            logger.critical(msg)
            raise ServiceError(msg)
        # silently pop source and dest nodes from the list if they were added by the user as first
        # and last elements in the constraints respectively. Other positions must lead to an error
        # caught later on
        if pathreq.nodes_list and pathreq.source == pathreq.nodes_list[0]:
            pathreq.loose_list.pop(0)
            pathreq.nodes_list.pop(0)
        if pathreq.nodes_list and pathreq.destination == pathreq.nodes_list[-1]:
            pathreq.loose_list.pop(-1)
            pathreq.nodes_list.pop(-1)
        # then process user-defined constraints with respect to automatic namings
        temp = deepcopy(pathreq)
        # this needs a temporary object since we may suppress/correct elements in the list
        # during the process
        for i, n_id in enumerate(temp.nodes_list):
            # n_id must not be a transceiver and must not be a fiber (not supported, the user
            # can not enter fiber names in excel)
            if n_id not in trxfibertype:
                # check that n_id is in the node list; if not, find a corresponding name
                if n_id in roadmtype + edfatype:
                    nodes_suggestion = [n_id]
                else:
                    # check roadm, fused and ila first, in this order, because ila automatic names
                    # contain roadm names. If it is a fused node, the next ila names might be correct
                    # suggestions, especially if the following fibers were split and ila names
                    # created with the name of the fused node
                    if n_id in corresp_roadm.keys():
                        nodes_suggestion = corresp_roadm[n_id]
                    elif n_id in corresp_fused.keys():
                        nodes_suggestion = corresp_fused[n_id] + corresp_ila[n_id]
                    elif n_id in corresp_ila.keys():
                        nodes_suggestion = corresp_ila[n_id]
                    else:
                        nodes_suggestion = []
                if nodes_suggestion:
                    try:
                        if len(nodes_suggestion) > 1:
                            # if there is more than one suggestion, we need to choose the direction;
                            # we rely on the next node provided by the user for this purpose
                            new_n = next(n for n in nodes_suggestion
                                         if n in next_node.keys() and next_node[n]
                                         in temp.nodes_list[i:] + [pathreq.destination] and
                                         next_node[n] not in temp.nodes_list[:i])
                        else:
                            new_n = nodes_suggestion[0]
                        if new_n != n_id:
                            # warn the user that the corrected name is used, only in verbose mode,
                            # e.g. 'a' is a roadm and the correct name is 'roadm a', or when there was
                            # too much ambiguity: 'b' is an ila, its name can be
                            # Edfa0_fiber (a → b)-xx if next node is c, or
                            # Edfa0_fiber (c → b)-xx if next node is a
                            msg = f'{ansi_escapes.yellow}Invalid route node specified:' +\
                                  f'\n\t\'{n_id}\', replaced with \'{new_n}\'{ansi_escapes.reset}'
                            logger.info(msg)
                            pathreq.nodes_list[pathreq.nodes_list.index(n_id)] = new_n
                    except StopIteration:
                        # should not happen, unless the requested direction does not exist
                        msg = f'{ansi_escapes.yellow}Invalid route specified {n_id}: could' +\
                              f' not decide on direction, skipped!\nPlease add a valid' +\
                              f' direction in constraints (next neighbour node){ansi_escapes.reset}'
                        print(msg)
                        logger.info(msg)
                        pathreq.loose_list.pop(pathreq.nodes_list.index(n_id))
                        pathreq.nodes_list.remove(n_id)
                else:
                    if temp.loose_list[i] == 'LOOSE':
                        # if no match can be found in the network, just ignore this constraint
                        # if it is a loose constraint, and
                        # warn the user that this node is not part of the topology
                        msg = f'{ansi_escapes.yellow}Invalid node specified:\n\t\'{n_id}\'' +\
                              f', could not use it as constraint, skipped!{ansi_escapes.reset}'
                        print(msg)
                        logger.info(msg)
                        pathreq.loose_list.pop(pathreq.nodes_list.index(n_id))
                        pathreq.nodes_list.remove(n_id)
                    else:
                        msg = f'{ansi_escapes.red}Could not find node:\n\t\'{n_id}\' in network' +\
                              f' topology. Strict constraint can not be applied.{ansi_escapes.reset}'
                        logger.critical(msg)
                        raise ServiceError(msg)
            else:
                if temp.loose_list[i] == 'LOOSE':
                    print(f'{ansi_escapes.yellow}Invalid route node specified:\n\t\'{n_id}\'' +
                          f' type is not supported as constraint with xls network input,' +
                          f' skipped!{ansi_escapes.reset}')
                    pathreq.loose_list.pop(pathreq.nodes_list.index(n_id))
                    pathreq.nodes_list.remove(n_id)
                else:
                    msg = f'{ansi_escapes.red}Invalid route node specified\n\t\'{n_id}\':' +\
                          f' type is not supported as constraint with xls network input.' +\
                          f' Strict constraint can not be applied.{ansi_escapes.reset}'
                    logger.critical(msg)
                    raise ServiceError(msg)
    return pathreqlist


@@ -0,0 +1,3 @@
'''
Tracking :py:mod:`.request` for spectrum and their :py:mod:`.spectrum_assignment`.
'''

gnpy/topology/request.py Normal file

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff