mirror of
https://github.com/Telecominfraproject/oopt-gnpy.git
synced 2025-11-02 02:57:52 +00:00
Merge changes from topic "moving-examples" into develop
* changes:
  - Enable saving the network as converted from XLS
  - Save either to JSON or to CSV, not to both
  - CLI: Allow Raman in path_requests_run
  - CLI: Unify handling of the network topology
  - Remove unused variables
  - CLI: show default values in --help
  - Unify handling of the --equipment option in examples
  - CLI: specify shared code options just once
  - Tweak the --help output
  - Remove unused statements
  - XLS -> JSON conversion: add a nice program for this
  - transmission_main_example: Do not write out a CSV file
  - tests: don't clutter up the source dir with generated CSVs
  - use load_json instead of open coding
  - tests: Do not produce JSON files in the source tree
  - Split JSON export from service XLS reading
  - tests: Do not create JSON files in the source tree
  - Do not always write out JSON data when reading XLS files
  - Remove incomplete support for "fuzzy name matching"
  - distribute example data along GNPy
  - Do not create *_auto_design.json by default
  - tests: remove something which looks like a path, but is not a valid path
  - tests: show that the examples still work when directly invoked
  - examples: add some additional descriptions help context
  - Distribute our examples via setuptools
  - tests: call our example entry points via functions
  - examples: prepare for overriding sys.args
  - examples: move path_requests_run to gnpy.tools
  - examples: move transmission_main_example into gnpy.tools
  - examples: use ansi_escapes
  - examples: manual coding style tweaks
  - examples: autopep8 -aaaaaaaaaa
  - examples: autopep8
@@ -1,3 +1,3 @@
 #!/bin/bash
-cp -nr /oopt-gnpy/examples /shared
+cp -nr /oopt-gnpy/gnpy/example-data /shared
 exec "$@"
@@ -15,7 +15,7 @@ if [[ $ALREADY_FOUND == 0 ]]; then
   # shared directory setup: do not clobber the real data
   mkdir trash
   cd trash
-  docker run -it --rm --volume $(pwd):/shared ${IMAGE_NAME} ./transmission_main_example.py
+  docker run -it --rm --volume $(pwd):/shared ${IMAGE_NAME} gnpy-transmission-example
 else
   echo "Image ${IMAGE_NAME}:${IMAGE_TAG} already available, will just update the other tags"
 fi
@@ -9,7 +9,7 @@ install: skip
 script:
   - python setup.py develop
   - pip install pytest-cov rstcheck
-  - pytest --cov-report=xml --cov=gnpy --cov=examples -v
+  - pytest --cov-report=xml --cov=gnpy -v
   - rstcheck --ignore-roles cite --ignore-directives automodule --recursive --ignore-messages '(Duplicate explicit target name.*)' .
   - sphinx-build -W --keep-going docs/ x-throwaway-location
 after_success:
@@ -3,6 +3,6 @@ COPY . /oopt-gnpy
 WORKDIR /oopt-gnpy
 RUN apt update; apt install -y git
 RUN pip install .
-WORKDIR /shared/examples
+WORKDIR /shared/example-data
 ENTRYPOINT ["/oopt-gnpy/.docker-entry.sh"]
 CMD ["/bin/bash"]
@@ -2,7 +2,7 @@
 How to prepare the Excel input file
 -----------------------------------
 
-`examples/transmission_main_example.py <examples/transmission_main_example.py>`_ gives the possibility to use an excel input file instead of a json file. The program then will generate the corresponding json file for you.
+``gnpy-transmission-example`` gives the possibility to use an excel input file instead of a json file. The program then will generate the corresponding json file for you.
 
 The file named 'meshTopologyExampleV2.xls' is an example.
@@ -34,7 +34,7 @@ Each line represents a 'node' (ROADM site or an in line amplifier site ILA or a
 - If filled, it can take "ROADM", "FUSED" or "ILA" values. If another string is used, it will be considered as not filled. FUSED means that ingress and egress spans will be fused together.
 
 - *State*, *Country*, *Region* are not mandatory.
-  "Region" is a holdover from the CORONET topology reference file `CORONET_Global_Topology.xlsx <examples/CORONET_Global_Topology.xlsx>`_. CORONET separates its network into geographical regions (Europe, Asia, Continental US.) This information is not used by gnpy.
+  "Region" is a holdover from the CORONET topology reference file `CORONET_Global_Topology.xlsx <gnpy/example-data/CORONET_Global_Topology.xlsx>`_. CORONET separates its network into geographical regions (Europe, Asia, Continental US.) This information is not used by gnpy.
 
 - *Longitude*, *Latitude* are not mandatory. If filled they should contain numbers.
@@ -80,11 +80,11 @@ and a fiber span from node3 to node6::
 
 - If filled it MUST contain numbers. If empty it is replaced by a default "80" km value.
 - If value is below 150 km, it is considered as a single (bidirectional) fiber span.
-- If value is over 150 km the `transmission_main_example.py <examples/transmission_main_example.py>`_ program will automatically suppose that intermediate span description are required and will generate fiber spans elements with "_1","_2", ... trailing strings which are not visible in the json output. The reason for the splitting is that current edfa usually do not support large span loss. The current assumption is that links larger than 150km will require intermediate amplification. This value will be revisited when Raman amplification is added
+- If value is over 150 km the ``gnpy-transmission-example`` program will automatically suppose that intermediate span descriptions are required and will generate fiber span elements with "_1", "_2", ... trailing strings which are not visible in the json output. The reason for the splitting is that current EDFAs usually do not support large span loss. The current assumption is that links larger than 150 km will require intermediate amplification. This value will be revisited when Raman amplification is added.
 
 - **Fiber type** is not mandatory.
 
-  If filled it must contain types listed in `eqpt_config.json <examples/eqpt_config.json>`_ in "Fiber" list "type_variety".
+  If filled it must contain types listed in `eqpt_config.json <gnpy/example-data/eqpt_config.json>`_ in "Fiber" list "type_variety".
   If not filled it takes "SSMF" as default value.
 
 - **Lineic att** is not mandatory.
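The splitting rule described above (spans longer than 150 km are cut into sub-spans named with "_1", "_2", ... suffixes) can be sketched as follows. This is purely illustrative: the 80 km target length and the exact sub-span lengths chosen here are assumptions, not gnpy's actual algorithm.

```python
import math

def split_span(name, length_km, max_span_km=150.0, target_km=80.0):
    """Illustrative sketch: spans over max_span_km are split into
    roughly equal sub-spans, named with "_1", "_2", ... suffixes."""
    if length_km <= max_span_km:
        return [(name, length_km)]
    n = math.ceil(length_km / target_km)  # number of sub-spans
    sub = length_km / n                   # equal-length split (an assumption)
    return [(f"{name}_{i + 1}", sub) for i in range(n)]
```

For example, a 200 km link would yield three sub-spans `fiber_1`, `fiber_2`, `fiber_3` under these assumptions.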
@@ -150,11 +150,11 @@ then Eqpt sheet should contain:
    C - amp3
 
 
-In case you already have filled Nodes and Links sheets `create_eqpt_sheet.py <examples/create_eqpt_sheet.py>`_ can be used to automatically create a template for the mandatory entries of the list.
+In case you already have filled Nodes and Links sheets `create_eqpt_sheet.py <gnpy/example-data/create_eqpt_sheet.py>`_ can be used to automatically create a template for the mandatory entries of the list.
 
 .. code-block:: shell
 
-    $ cd examples
+    $ cd $(gnpy-example-data)
     $ python create_eqpt_sheet.py meshTopologyExampleV2.xls
 
 This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content can be directly copied into the Eqt sheet of the excel file. The user then can fill the values in the rest of the columns.
@@ -167,7 +167,7 @@ This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content ca
 - **Node Z** is mandatory. It is the egress direction from the *Node A* site. Multiple Links between the same Node A and Node Z is not supported.
 
 - **amp type** is not mandatory.
-  If filled it must contain types listed in `eqpt_config.json <examples/eqpt_config.json>`_ in "Edfa" list "type_variety".
+  If filled it must contain types listed in `eqpt_config.json <gnpy/example-data/eqpt_config.json>`_ in "Edfa" list "type_variety".
   If not filled it takes "std_medium_gain" as default value.
   If filled with fused, a fused element with 0.0 dB loss will be placed instead of an amplifier. This might be used to avoid booster amplifier on a ROADM direction.
@@ -189,7 +189,7 @@ This generates a text file meshTopologyExampleV2_eqt_sheet.txt whose content ca
 Service sheet
 -------------
 
-Service sheet is optional. It lists the services for which path and feasibility must be computed with path_requests_run.py.
+Service sheet is optional. It lists the services for which path and feasibility must be computed with ``gnpy-path-request``.
 
 Service sheet must contain 11 columns::
@@ -220,10 +220,10 @@ Service sheet must contain 11 columns::
 
 - **path bandwidth** is mandatory. It is the amount of capacity required between source and destination in Gbit/s. Value should be positive (non zero). It is used to compute the amount of required spectrum for the service.
 
-path_requests_run.py
-------------------------
+gnpy-path-request
+-----------------
 
-**Usage**: path_requests_run.py [-h] [-bi] [-v] [-o OUTPUT]
+**Usage**: gnpy-path-request [-h] [-bi] [-v] [-o OUTPUT]
 [network_filename xls or json] [service_filename xls or json] [eqpt_filename json]
 
 optional arguments::
@@ -235,12 +235,12 @@ optional arguments::
 
 .. code-block:: shell
 
-    $ cd examples
-    $ python path_requests_run.py meshTopologyExampleV2.xls service_file.json eqpt_file -o output_file.json
+    $ cd $(gnpy-example-data)
+    $ gnpy-path-request meshTopologyExampleV2.xls service_file.json eqpt_file -o output_file.json
 
 A function that computes performances for a list of services provided in the service file (accepts json or excel format).
 
-if the service <file.xls> is in xls format, path_requests_run.py converts it to a json file <file_services.json> following the Yang model for requesting Path Computation defined in `draft-ietf-teas-yang-path-computation-01.txt <https://www.ietf.org/id/draft-ietf-teas-yang-path-computation-01.pdf>`_. For PSE use, additional fields with trx type and mode have been added to the te-bandwidth field.
+If the service <file.xls> is in xls format, ``gnpy-path-request`` converts it to a json file <file_services.json> following the Yang model for requesting Path Computation defined in `draft-ietf-teas-yang-path-computation-01.txt <https://www.ietf.org/id/draft-ietf-teas-yang-path-computation-01.pdf>`_. For PSE use, additional fields with trx type and mode have been added to the te-bandwidth field.
 
 A template for the json file can be found here: `service_template.json <service_template.json>`_
@@ -250,7 +250,7 @@ If a file is specified with the optional -o argument, the result of the computat
 
 A template for the result of computation json file can be found here: `path_result_template.json <path_result_template.json>`_
 
-Important note: path_requests_run.py is not a network dimensionning tool : each service does not reserve spectrum, or occupy ressources such as transponders. It only computes path feasibility assuming the spectrum (between defined frequencies) is loaded with "nb of channels" spaced by "spacing" values as specified in the system parameters input in the service file, each cannel having the same characteristics in terms of baudrate, format, ... as the service transponder. The transceiver element acts as a "logical starting/stopping point" for the spectral information propagation. At that point it is not meant to represent the capacity of add drop ports
+Important note: ``gnpy-path-request`` is not a network dimensioning tool: each service does not reserve spectrum or occupy resources such as transponders. It only computes path feasibility assuming the spectrum (between defined frequencies) is loaded with "nb of channels" spaced by "spacing" values as specified in the system parameters input in the service file, each channel having the same characteristics in terms of baudrate, format, ... as the service transponder. The transceiver element acts as a "logical starting/stopping point" for the spectral information propagation. At that point it is not meant to represent the capacity of add drop ports.
 As a result transponder type is not part of the network info. It is related to the list of services requests.
 
 The current version includes a spectrum assignment feature that enables to compute a candidate spectrum assignment for each service based on a first fit policy. Spectrum is assigned based on service specified spacing value, path_bandwidth value and selected mode for the transceiver. This spectrum assignment includes a basic capacity planning capability so that the spectrum resource is limited by the frequency min and max values defined for the links. If the requested services reach the link spectrum capacity, additional services feasibility is computed but they are marked as blocked due to spectrum reasons.
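A first-fit policy of the kind described above can be sketched in a few lines over an abstract grid of spectrum slots. This is a simplified model for illustration only; gnpy's actual assignment (in ``gnpy.topology.spectrum_assignment``) works on the real frequency grid and N,M slot descriptors.

```python
def first_fit(occupied, n_slots, total_slots):
    """Return the first start index where n_slots contiguous free slots
    are available, or None when the request is blocked.
    'occupied' is a set of already-assigned slot indices."""
    for start in range(total_slots - n_slots + 1):
        if all(s not in occupied for s in range(start, start + n_slots)):
            return start
    return None  # blocked due to spectrum reasons
```

For example, with slots 0-2 occupied on an 8-slot link, a 2-slot request lands at index 3; on a fully occupied link the function returns None, mirroring the "blocked due to spectrum reason" outcome described above.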
52
README.rst
@@ -52,16 +52,16 @@ On Linux and Mac, run:
 .. code-block:: shell-session
 
     $ docker run -it --rm --volume $(pwd):/shared telecominfraproject/oopt-gnpy
-    root@bea050f186f7:/shared/examples#
+    root@bea050f186f7:/shared/example-data#
 
 On Windows, launch from Powershell as:
 
 .. code-block:: powershell
 
     PS C:\> docker run -it --rm --volume ${PWD}:/shared telecominfraproject/oopt-gnpy
-    root@89784e577d44:/shared/examples#
+    root@89784e577d44:/shared/example-data#
 
-In both cases, a directory named ``examples/`` will appear in your current working directory.
+In both cases, a directory named ``example-data/`` will appear in your current working directory.
 GNPy automatically populates it with example files from the current release.
 Remove that directory if you want to start from scratch.
@@ -157,25 +157,25 @@ This example demonstrates how GNPy can be used to check the expected SNR at the
    :target: https://asciinema.org/a/252295
 
 By default, this script operates on a single span network defined in
-`examples/edfa_example_network.json <examples/edfa_example_network.json>`_
+`gnpy/example-data/edfa_example_network.json <gnpy/example-data/edfa_example_network.json>`_
 
 You can specify a different network at the command line as follows. For
 example, to use the CORONET Global network defined in
-`examples/CORONET_Global_Topology.json <examples/CORONET_Global_Topology.json>`_:
+`gnpy/example-data/CORONET_Global_Topology.json <gnpy/example-data/CORONET_Global_Topology.json>`_:
 
 .. code-block:: shell-session
 
-    $ ./examples/transmission_main_example.py examples/CORONET_Global_Topology.json
+    $ gnpy-transmission-example $(gnpy-example-data)/CORONET_Global_Topology.json
 
 It is also possible to use an Excel file input (for example
-`examples/CORONET_Global_Topology.xlsx <examples/CORONET_Global_Topology.xlsx>`_).
+`gnpy/example-data/CORONET_Global_Topology.xlsx <gnpy/example-data/CORONET_Global_Topology.xlsx>`_).
 The Excel file will be processed into a JSON file with the same prefix. For
 further instructions on how to prepare the Excel input file, see
 `Excel_userguide.rst <Excel_userguide.rst>`_.
 
 The main transmission example will calculate the average signal OSNR and SNR
 across network elements (transceiver, ROADMs, fibers, and amplifiers)
-between two transceivers selected by the user. Additional details are provided by doing ``transmission_main_example.py -h``. (By default, for the CORONET Global
+between two transceivers selected by the user. Additional details are provided by doing ``gnpy-transmission-example -h``. (By default, for the CORONET Global
 network, it will show the transmission of spectral information between Abilene and Albany)
 
 This script calculates the average signal OSNR = |OSNR| and SNR = |SNR|.
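The OSNR and SNR definitions referenced in the text (signal over ASE noise, and signal over the sum of ASE and nonlinear interference noise) can be illustrated with a minimal sketch; the function names and sample powers are illustrative, not gnpy's API.

```python
from math import log10

def osnr_db(p_signal, p_ase):
    """OSNR: signal power over ASE noise power, expressed in dB."""
    return 10 * log10(p_signal / p_ase)

def snr_db(p_signal, p_ase, p_nli):
    """SNR: signal power over the sum of ASE and nonlinear interference
    (NLI) noise, in dB. Always <= OSNR, since p_nli >= 0."""
    return 10 * log10(p_signal / (p_ase + p_nli))
```

With no NLI the two coincide; any nonzero NLI pushes the SNR below the OSNR, which is why the report shows both figures per channel.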
@@ -189,12 +189,12 @@ interference noise.
 .. |Pase| replace:: P\ :sub:`ase`
 .. |Pnli| replace:: P\ :sub:`nli`
 
-Further Instructions for Use (`transmission_main_example.py`, `path_requests_run.py`)
--------------------------------------------------------------------------------------
+Further Instructions for Use
+----------------------------
 
 Design and transmission parameters are defined in a dedicated json file. By
-default, this information is read from `examples/eqpt_config.json
-<examples/eqpt_config.json>`_. This file defines the equipment libraries that
+default, this information is read from `gnpy/example-data/eqpt_config.json
+<gnpy/example-data/eqpt_config.json>`_. This file defines the equipment libraries that
 can be customized (EDFAs, fibers, and transceivers).
 
 It also defines the simulation parameters (spans, ROADMs, and the spectral
@@ -205,7 +205,7 @@ can be added and existing ones removed. Three different noise models are availab
 
 1. ``'type_def': 'variable_gain'`` is a simplified model simulating a 2-coil EDFA with internal, input and output VOAs. The NF vs gain response is calculated accordingly based on the input parameters: ``nf_min``, ``nf_max``, and ``gain_flatmax``. It is not a simple interpolation but a 2-stage NF calculation.
 2. ``'type_def': 'fixed_gain'`` is a fixed gain model. `NF == Cte == nf0` if `gain_min < gain < gain_flatmax`
-3. ``'type_def': None`` is an advanced model. A detailed JSON configuration file is required (by default `examples/std_medium_gain_advanced_config.json <examples/std_medium_gain_advanced_config.json>`_). It uses a 3rd order polynomial where NF = f(gain), NF_ripple = f(frequency), gain_ripple = f(frequency), N-array dgt = f(frequency). Compared to the previous models, NF ripple and gain ripple are modelled.
+3. ``'type_def': None`` is an advanced model. A detailed JSON configuration file is required (by default `gnpy/example-data/std_medium_gain_advanced_config.json <gnpy/example-data/std_medium_gain_advanced_config.json>`_). It uses a 3rd order polynomial where NF = f(gain), NF_ripple = f(frequency), gain_ripple = f(frequency), N-array dgt = f(frequency). Compared to the previous models, NF ripple and gain ripple are modelled.
 
 For all amplifier models:
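The ``fixed_gain`` rule above (`NF == Cte == nf0` inside the allowed gain range) is simple enough to sketch directly. The function name and the numeric values below are illustrative placeholders, not gnpy's implementation.

```python
def edfa_nf(gain_db, nf0=6.5, gain_min=15.0, gain_flatmax=21.0):
    """Sketch of a 'fixed_gain' EDFA noise model: NF is a constant nf0
    whenever the operating gain lies inside [gain_min, gain_flatmax].
    All numeric defaults here are made-up placeholders."""
    if not gain_min <= gain_db <= gain_flatmax:
        raise ValueError(f"gain {gain_db} dB outside [{gain_min}, {gain_flatmax}] dB")
    return nf0
```

The ``variable_gain`` model replaces the constant with a 2-stage NF calculation driven by ``nf_min``, ``nf_max`` and ``gain_flatmax``, and the advanced model replaces it with frequency-dependent polynomials, as described above.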
@@ -244,7 +244,7 @@ The fiber library currently describes SSMF and NZDF but additional fiber types c
 The transceiver equipment library is a list of supported transceivers. New
 transceivers can be added and existing ones removed at will by the user. It is
 used to determine the service list path feasibility when running the
-`path_request_run.py routine <examples/path_request_run.py>`_.
+`path_request_run.py routine <gnpy/example-data/path_request_run.py>`_.
 
 +----------------------+-----------+-----------------------------------------+
 | field                | type      | description                             |
@@ -499,7 +499,7 @@ one power/channel definition.
 |                      |           | transceiver OSNR.                         |
 +----------------------+-----------+-------------------------------------------+
 
-The `transmission_main_example.py <examples/transmission_main_example.py>`_ script propagates a spectrum of channels at 32 Gbaud, 50 GHz spacing and 0 dBm/channel.
+The ``gnpy-transmission-example`` script propagates a spectrum of channels at 32 Gbaud, 50 GHz spacing and 0 dBm/channel.
 Launch power can be overridden by using the ``--power`` argument.
 Spectrum information is not yet parametrized but can be modified directly in the ``eqpt_config.json`` (via the ``SpectralInformation`` -SI- structure) to accommodate any baud rate or spacing.
 The number of channels is computed based on ``spacing`` and ``f_min``, ``f_max`` values.
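The channel-count rule in the last sentence can be illustrated with a small sketch. One possible convention is to count how many channels of width ``spacing`` fit between ``f_min`` and ``f_max``; the helper and the grid values below are illustrative, not gnpy's own function, whose edge-case behaviour may differ.

```python
def channel_count(f_min_hz, f_max_hz, spacing_hz):
    """Number of channels that fit on a fixed grid between f_min and
    f_max, assuming one channel per 'spacing' of bandwidth (a sketch)."""
    return int((f_max_hz - f_min_hz) // spacing_hz)
```

For instance, a C-band-like window from 191.3 THz to 196.1 THz at 50 GHz spacing gives 96 channels, and halving the window's granularity to 100 GHz gives 48.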
@@ -508,19 +508,19 @@ An experimental support for Raman amplification is available:
 
 .. code-block:: shell
 
-    $ ./examples/transmission_main_example.py \
-      examples/raman_edfa_example_network.json \
-      --sim examples/sim_params.json --show-channels
+    $ gnpy-transmission-example \
+      $(gnpy-example-data)/raman_edfa_example_network.json \
+      --sim $(gnpy-example-data)/sim_params.json --show-channels
 
-Configuration of Raman pumps (their frequencies, power and pumping direction) is done via the `RamanFiber element in the network topology <examples/raman_edfa_example_network.json>`_.
-General numeric parameters for simulaiton control are provided in the `examples/sim_params.json <examples/sim_params.json>`_.
+Configuration of Raman pumps (their frequencies, power and pumping direction) is done via the `RamanFiber element in the network topology <gnpy/example-data/raman_edfa_example_network.json>`_.
+General numeric parameters for simulation control are provided in the `gnpy/example-data/sim_params.json <gnpy/example-data/sim_params.json>`_.
 
-Use `examples/path_requests_run.py <examples/path_requests_run.py>`_ to run multiple optimizations as follows:
+Use ``gnpy-path-request`` to run multiple optimizations as follows:
 
 .. code-block:: shell
 
-    $ python path_requests_run.py -h
-    Usage: path_requests_run.py [-h] [-v] [-o OUTPUT] [network_filename] [service_filename] [eqpt_filename]
+    $ gnpy-path-request -h
+    Usage: gnpy-path-request [-h] [-v] [-o OUTPUT] [network_filename] [service_filename] [eqpt_filename]
 
 The ``network_filename`` and ``service_filename`` can be an XLS or JSON file. The ``eqpt_filename`` must be a JSON file.
@@ -528,13 +528,13 @@ To see an example of it, run:
 
 .. code-block:: shell
 
-    $ cd examples
-    $ python path_requests_run.py meshTopologyExampleV2.xls meshTopologyExampleV2_services.json eqpt_config.json -o output_file.json
+    $ cd $(gnpy-example-data)
+    $ gnpy-path-request meshTopologyExampleV2.xls meshTopologyExampleV2_services.json eqpt_config.json -o output_file.json
 
 This program requires a list of connections to be estimated and the equipment
 library. The program computes performances for the list of services (accepts
 JSON or Excel format) using the same spectrum propagation modules as
-``transmission_main_example.py``. Explanation on the Excel template is provided in
+``gnpy-transmission-example``. Explanation on the Excel template is provided in
 the `Excel_userguide.rst <Excel_userguide.rst#service-sheet>`_. Template for
 the JSON format can be found here: `service-template.json
 <service-template.json>`_.
@@ -1,194 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
"""
|
||||
path_requests_run.py
|
||||
====================
|
||||
|
||||
Reads a JSON request file in accordance with the Yang model
|
||||
for requesting path computation and returns path results in terms
|
||||
of path and feasibilty.
|
||||
|
||||
See: draft-ietf-teas-yang-path-computation-01.txt
|
||||
"""
|
||||
|
||||
from sys import exit
|
||||
from argparse import ArgumentParser
|
||||
from pathlib import Path
|
||||
from logging import getLogger, basicConfig, CRITICAL, DEBUG, INFO
|
||||
from json import dumps
|
||||
from numpy import mean
|
||||
from gnpy.core import ansi_escapes
|
||||
from gnpy.core.utils import automatic_nch
|
||||
from gnpy.core.network import build_network
|
||||
from gnpy.core.utils import lin2db
|
||||
import gnpy.core.exceptions as exceptions
|
||||
from gnpy.topology.request import (ResultElement, jsontocsv, compute_path_dsjctn, requests_aggregation,
|
||||
BLOCKING_NOPATH, correct_json_route_list,
|
||||
deduplicate_disjunctions, compute_path_with_disjunction)
|
||||
from gnpy.topology.spectrum_assignment import build_oms_list, pth_assign_spectrum
|
||||
import gnpy.tools.cli_examples as cli_examples
|
||||
from gnpy.tools.json_io import load_requests, save_network, requests_from_json, disjunctions_from_json
|
||||
from math import ceil
|
||||
|
||||
#EQPT_LIBRARY_FILENAME = Path(__file__).parent / 'eqpt_config.json'
|
||||
|
||||
LOGGER = getLogger(__name__)
|
||||
|
||||
PARSER = ArgumentParser(description='A function that computes performances for a list of ' +
|
||||
'services provided in a json file or an excel sheet.')
|
||||
PARSER.add_argument('network_filename', nargs='?', type=Path,\
|
||||
default=Path(__file__).parent / 'meshTopologyExampleV2.xls',\
|
||||
help='input topology file in xls or json')
|
||||
PARSER.add_argument('service_filename', nargs='?', type=Path,\
|
||||
default=Path(__file__).parent / 'meshTopologyExampleV2.xls',\
|
||||
help='input service file in xls or json')
|
||||
PARSER.add_argument('eqpt_filename', nargs='?', type=Path,\
|
||||
default=Path(__file__).parent / 'eqpt_config.json',\
|
||||
help='input equipment library in json. Default is eqpt_config.json')
|
||||
PARSER.add_argument('-bi', '--bidir', action='store_true',\
|
||||
help='considers that all demands are bidir')
|
||||
PARSER.add_argument('-v', '--verbose', action='count', default=0,\
|
||||
help='increases verbosity for each occurence')
|
||||
PARSER.add_argument('-o', '--output', type=Path)
|
||||
|
||||
|
||||
def path_result_json(pathresult):
|
||||
""" create the response dictionnary
|
||||
"""
|
||||
data = {
|
||||
'response': [n.json for n in pathresult]
|
||||
}
|
||||
return data
|
||||
|
||||
def main(args):
|
||||
""" main function that calls all functions
|
||||
"""
|
||||
LOGGER.info(f'Computing path requests {args.service_filename} into JSON format')
|
||||
print(f'{ansi_escapes.blue}Computing path requests {args.service_filename} into JSON format{ansi_escapes.reset}')
|
||||
# for debug
|
||||
# print( args.eqpt_filename)
|
||||
|
||||
(equipment, network) = cli_examples.load_common_data(args.eqpt_filename, args.network_filename)
|
||||
|
||||
# Build the network once using the default power defined in SI in eqpt config
|
||||
# TODO power density: db2linp(ower_dbm": 0)/power_dbm": 0 * nb channels as defined by
|
||||
# spacing, f_min and f_max
|
||||
p_db = equipment['SI']['default'].power_dbm
|
||||
|
||||
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,\
|
||||
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
|
||||
build_network(network, equipment, p_db, p_total_db)
|
||||
save_network(args.network_filename, network)
|
||||
oms_list = build_oms_list(network, equipment)
|
||||
|
||||
try:
|
||||
data = load_requests(args.service_filename, equipment, bidir=args.bidir, network=network, network_filename=args.network_filename)
|
||||
rqs = requests_from_json(data, equipment)
|
||||
except exceptions.ServiceError as e:
|
||||
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
|
||||
exit(1)
|
||||
# check that request ids are unique. Non unique ids, may
|
||||
# mess the computation: better to stop the computation
|
||||
all_ids = [r.request_id for r in rqs]
|
||||
if len(all_ids) != len(set(all_ids)):
|
||||
for item in list(set(all_ids)):
|
||||
all_ids.remove(item)
|
||||
msg = f'Requests id {all_ids} are not unique'
|
||||
LOGGER.critical(msg)
|
||||
exit()
|
||||
rqs = correct_json_route_list(network, rqs)
|
||||
|
||||
# pths = compute_path(network, equipment, rqs)
|
||||
dsjn = disjunctions_from_json(data)
|
||||
|
||||
print(f'{ansi_escapes.blue}List of disjunctions{ansi_escapes.reset}')
|
||||
print(dsjn)
|
||||
# need to warn or correct in case of wrong disjunction form
|
||||
# disjunction must not be repeated with same or different ids
|
||||
dsjn = deduplicate_disjunctions(dsjn)
|
||||
|
||||
# Aggregate demands with same exact constraints
|
||||
print(f'{ansi_escapes.blue}Aggregating similar requests{ansi_escapes.reset}')
|
||||
|
||||
rqs, dsjn = requests_aggregation(rqs, dsjn)
|
||||
# TODO export novel set of aggregated demands in a json file
|
||||
|
||||
print(f'{ansi_escapes.blue}The following services have been requested:{ansi_escapes.reset}')
|
||||
print(rqs)
|
||||
|
||||
print(f'{ansi_escapes.blue}Computing all paths with constraints{ansi_escapes.reset}')
|
||||
try:
|
||||
pths = compute_path_dsjctn(network, equipment, rqs, dsjn)
|
||||
except exceptions.DisjunctionError as this_e:
|
||||
print(f'{ansi_escapes.red}Disjunction error:{ansi_escapes.reset} {this_e}')
|
||||
exit(1)
|
||||
|
||||
print(f'{ansi_escapes.blue}Propagating on selected path{ansi_escapes.reset}')
|
||||
propagatedpths, reversed_pths, reversed_propagatedpths = compute_path_with_disjunction(network, equipment, rqs, pths)
|
||||
# Note that deepcopy used in compute_path_with_disjunction returns
|
||||
# a list of nodes which are not belonging to network (they are copies of the node objects).
|
||||
# so there can not be propagation on these nodes.
|
||||
|
||||
pth_assign_spectrum(pths, rqs, oms_list, reversed_pths)
|
||||
|
||||
print(f'{ansi_escapes.blue}Result summary{ansi_escapes.reset}')
|
||||
header = ['req id', ' demand', ' snr@bandwidth A-Z (Z-A)', ' snr@0.1nm A-Z (Z-A)',\
|
||||
' Receiver minOSNR', ' mode', ' Gbit/s', ' nb of tsp pairs',\
|
||||
'N,M or blocking reason']
|
||||
data = []
|
||||
data.append(header)
|
||||
for i, this_p in enumerate(propagatedpths):
|
||||
rev_pth = reversed_propagatedpths[i]
|
||||
if rev_pth and this_p:
|
||||
psnrb = f'{round(mean(this_p[-1].snr),2)} ({round(mean(rev_pth[-1].snr),2)})'
|
||||
psnr = f'{round(mean(this_p[-1].snr_01nm), 2)}' +\
|
||||
f' ({round(mean(rev_pth[-1].snr_01nm),2)})'
|
||||
elif this_p:
|
||||
psnrb = f'{round(mean(this_p[-1].snr),2)}'
|
||||
psnr = f'{round(mean(this_p[-1].snr_01nm),2)}'
|
||||
|
||||
try :
|
||||
if rqs[i].blocking_reason in BLOCKING_NOPATH:
|
||||
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} :',\
|
||||
                    f'-', f'-', f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',\
                    f'-', f'{rqs[i].blocking_reason}']
            else:
                line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,\
                    psnr, f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9, 2)}',\
                    f'-', f'{rqs[i].blocking_reason}']
        except AttributeError:
            line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,\
                psnr, f'{rqs[i].OSNR}', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',\
                f'{ceil(rqs[i].path_bandwidth / rqs[i].bit_rate) }', f'({rqs[i].N},{rqs[i].M})']
        data.append(line)

    col_width = max(len(word) for row in data for word in row[2:])   # padding
    firstcol_width = max(len(row[0]) for row in data)   # padding
    secondcol_width = max(len(row[1]) for row in data)   # padding
    for row in data:
        firstcol = ''.join(row[0].ljust(firstcol_width))
        secondcol = ''.join(row[1].ljust(secondcol_width))
        remainingcols = ''.join(word.center(col_width, ' ') for word in row[2:])
        print(f'{firstcol} {secondcol} {remainingcols}')
    print(f'{ansi_escapes.yellow}Result summary shows mean SNR and OSNR (average over all channels){ansi_escapes.reset}')

    if args.output:
        result = []
        # assumes that list of rqs and list of propagatedpths have same order
        for i, pth in enumerate(propagatedpths):
            result.append(ResultElement(rqs[i], pth, reversed_propagatedpths[i]))
        temp = path_result_json(result)
        fnamecsv = f'{str(args.output)[0:len(str(args.output))-len(str(args.output.suffix))]}.csv'
        fnamejson = f'{str(args.output)[0:len(str(args.output))-len(str(args.output.suffix))]}.json'
        with open(fnamejson, 'w', encoding='utf-8') as fjson:
            fjson.write(dumps(path_result_json(result), indent=2, ensure_ascii=False))
        with open(fnamecsv, "w", encoding='utf-8') as fcsv:
            jsontocsv(temp, equipment, fcsv)
        print('\x1b[1;34;40m' + f'saving in {args.output} and {fnamecsv}' + '\x1b[0m')


if __name__ == '__main__':
    ARGS = PARSER.parse_args()
    basicConfig(level={2: DEBUG, 1: INFO, 0: CRITICAL}.get(ARGS.verbose, DEBUG))
    main(ARGS)
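The result table above is padded by hand: the first two columns are left-justified to their widest entry and the rest are centered on a shared width. A minimal, self-contained sketch of that padding logic (the `format_table` helper and the sample rows are illustrative, not part of GNPy):

```python
def format_table(rows):
    """Left-justify the first two columns and center the rest,
    mirroring the padding used for the result summary above."""
    # shared width for all columns past the second one
    col_width = max(len(word) for row in rows for word in row[2:])
    first_w = max(len(row[0]) for row in rows)
    second_w = max(len(row[1]) for row in rows)
    lines = []
    for row in rows:
        rest = ''.join(word.center(col_width, ' ') for word in row[2:])
        lines.append(f'{row[0].ljust(first_w)} {row[1].ljust(second_w)} {rest}')
    return lines


rows = [['id', ' src to dst ', 'SNR', 'OSNR'],
        ['0', ' a to b ', '23.5', '28.1']]
for line in format_table(rows):
    print(line)
```

Every emitted line ends up the same length, which is what keeps the columns aligned without any tabular library.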
@@ -1,233 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

'''
transmission_main_example.py
============================

Main example for transmission simulation.

Reads from network JSON (by default, `edfa_example_network.json`)
'''

from argparse import ArgumentParser
from sys import exit
from pathlib import Path
from logging import getLogger, basicConfig, INFO, ERROR, DEBUG
from numpy import linspace, mean
from gnpy.core.equipment import trx_mode_params
from gnpy.core.network import build_network
from gnpy.core.elements import Transceiver, Fiber, RamanFiber
from gnpy.core.utils import db2lin, lin2db, write_csv
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.topology.request import PathRequest, compute_constrained_path, propagate2
import gnpy.tools.cli_examples as cli_examples
from gnpy.tools.json_io import save_network
from gnpy.tools.plots import plot_baseline, plot_results


logger = getLogger(__name__)


def main(network, equipment, source, destination, req=None):
    result_dicts = {}
    network_data = [{
        'network_name': str(args.filename),
        'source': source.uid,
        'destination': destination.uid
    }]
    result_dicts.update({'network': network_data})
    design_data = [{
        'power_mode': equipment['Span']['default'].power_mode,
        'span_power_range': equipment['Span']['default'].delta_power_range_db,
        'design_pch': equipment['SI']['default'].power_dbm,
        'baud_rate': equipment['SI']['default'].baud_rate
    }]
    result_dicts.update({'design': design_data})
    simulation_data = []
    result_dicts.update({'simulation results': simulation_data})

    power_mode = equipment['Span']['default'].power_mode
    print('\n'.join([f'Power mode is set to {power_mode}',
                     f'=> it can be modified in eqpt_config.json - Span']))

    pref_ch_db = lin2db(req.power*1e3)  # reference channel power / span (SL=20dB)
    pref_total_db = pref_ch_db + lin2db(req.nb_channel)  # reference total power / span (SL=20dB)
    build_network(network, equipment, pref_ch_db, pref_total_db)
    path = compute_constrained_path(network, req)

    spans = [s.params.length for s in path if isinstance(s, RamanFiber) or isinstance(s, Fiber)]
    print(f'\nThere are {len(spans)} fiber spans over {sum(spans)/1000:.0f} km between {source.uid} '
          f'and {destination.uid}')
    print(f'\nNow propagating between {source.uid} and {destination.uid}:')

    try:
        p_start, p_stop, p_step = equipment['SI']['default'].power_range_db
        p_num = abs(int(round((p_stop - p_start)/p_step))) + 1 if p_step != 0 else 1
        power_range = list(linspace(p_start, p_stop, p_num))
    except TypeError:
        print('invalid power range definition in eqpt_config, should be power_range_db: [lower, upper, step]')
        power_range = [0]

    if not power_mode:
        # power cannot be changed in gain mode
        power_range = [0]
    for dp_db in power_range:
        req.power = db2lin(pref_ch_db + dp_db)*1e-3
        if power_mode:
            print(f'\nPropagating with input power = {ansi_escapes.cyan}{lin2db(req.power*1e3):.2f} dBm{ansi_escapes.reset}:')
        else:
            print(f'\nPropagating in {ansi_escapes.cyan}gain mode{ansi_escapes.reset}: power cannot be set manually')
        infos = propagate2(path, req, equipment)
        if len(power_range) == 1:
            for elem in path:
                print(elem)
            if power_mode:
                print(f'\nTransmission result for input power = {lin2db(req.power*1e3):.2f} dBm:')
            else:
                print(f'\nTransmission results:')
            print(f'  Final SNR total (0.1 nm): {ansi_escapes.cyan}{mean(destination.snr_01nm):.02f} dB{ansi_escapes.reset}')
        else:
            print(path[-1])

        # print(f'\n !!!!!!!!!!!!!!!!! TEST POINT !!!!!!!!!!!!!!!!!!!!!')
        # print(f'carriers ase output of {path[1]} =\n {list(path[1].carriers("out", "nli"))}')
        # => use "in" or "out" parameter
        # => use "nli" or "ase" or "signal" or "total" parameter
        if power_mode:
            simulation_data.append({
                'Pch_dBm': pref_ch_db + dp_db,
                'OSNR_ASE_0.1nm': round(mean(destination.osnr_ase_01nm), 2),
                'OSNR_ASE_signal_bw': round(mean(destination.osnr_ase), 2),
                'SNR_nli_signal_bw': round(mean(destination.osnr_nli), 2),
                'SNR_total_signal_bw': round(mean(destination.snr), 2)
            })
        else:
            simulation_data.append({
                'gain_mode': 'power cannot be set',
                'OSNR_ASE_0.1nm': round(mean(destination.osnr_ase_01nm), 2),
                'OSNR_ASE_signal_bw': round(mean(destination.osnr_ase), 2),
                'SNR_nli_signal_bw': round(mean(destination.osnr_nli), 2),
                'SNR_total_signal_bw': round(mean(destination.snr), 2)
            })
    write_csv(result_dicts, 'simulation_result.csv')
    return path, infos


parser = ArgumentParser()
parser.add_argument('-e', '--equipment', type=Path,
                    default=Path(__file__).parent / 'eqpt_config.json')
parser.add_argument('--sim-params', type=Path,
                    default=None, help='Path to the JSON containing simulation parameters (required for Raman)')
parser.add_argument('--show-channels', action='store_true', help='Show final per-channel OSNR summary')
parser.add_argument('-pl', '--plot', action='store_true')
parser.add_argument('-v', '--verbose', action='count', default=0, help='increases verbosity for each occurrence')
parser.add_argument('-l', '--list-nodes', action='store_true', help='list all transceiver nodes')
parser.add_argument('-po', '--power', default=0, help='channel ref power in dBm')
parser.add_argument('-names', '--names-matching', action='store_true', help='display network names that are closed matches')
parser.add_argument('filename', nargs='?', type=Path,
                    default=Path(__file__).parent / 'edfa_example_network.json')
parser.add_argument('source', nargs='?', help='source node')
parser.add_argument('destination', nargs='?', help='destination node')


if __name__ == '__main__':
    args = parser.parse_args()
    basicConfig(level={0: ERROR, 1: INFO, 2: DEBUG}.get(args.verbose, DEBUG))

    (equipment, network) = cli_examples.load_common_data(args.equipment, args.filename, args.sim_params,
                                                         fuzzy_name_matching=args.names_matching)

    if args.plot:
        plot_baseline(network)

    transceivers = {n.uid: n for n in network.nodes() if isinstance(n, Transceiver)}

    if not transceivers:
        exit('Network has no transceivers!')
    if len(transceivers) < 2:
        exit('Network has only one transceiver!')

    if args.list_nodes:
        for uid in transceivers:
            print(uid)
        exit()

    # First try to find exact match if source/destination provided
    if args.source:
        source = transceivers.pop(args.source, None)
        valid_source = True if source else False
    else:
        source = None
        logger.info('No source node specified: picking random transceiver')

    if args.destination:
        destination = transceivers.pop(args.destination, None)
        valid_destination = True if destination else False
    else:
        destination = None
        logger.info('No destination node specified: picking random transceiver')

    # If no exact match try to find partial match
    if args.source and not source:
        # TODO code a more advanced regex to find nodes match
        source = next((transceivers.pop(uid) for uid in transceivers \
                       if args.source.lower() in uid.lower()), None)

    if args.destination and not destination:
        # TODO code a more advanced regex to find nodes match
        destination = next((transceivers.pop(uid) for uid in transceivers \
                            if args.destination.lower() in uid.lower()), None)

    # If no partial match or no source/destination provided pick random
    if not source:
        source = list(transceivers.values())[0]
        del transceivers[source.uid]

    if not destination:
        destination = list(transceivers.values())[0]

    logger.info(f'source = {args.source!r}')
    logger.info(f'destination = {args.destination!r}')

    params = {}
    params['request_id'] = 0
    params['trx_type'] = ''
    params['trx_mode'] = ''
    params['source'] = source.uid
    params['destination'] = destination.uid
    params['bidir'] = False
    params['nodes_list'] = [destination.uid]
    params['loose_list'] = ['strict']
    params['format'] = ''
    params['path_bandwidth'] = 0
    trx_params = trx_mode_params(equipment)
    if args.power:
        trx_params['power'] = db2lin(float(args.power))*1e-3
    params.update(trx_params)
    req = PathRequest(**params)
    path, infos = main(network, equipment, source, destination, req)
    save_network(args.filename, network)

    if args.show_channels:
        print('\nThe total SNR per channel at the end of the line is:')
        print('{:>5}{:>26}{:>26}{:>28}{:>28}{:>28}' \
              .format('Ch. #', 'Channel frequency (THz)', 'Channel power (dBm)', 'OSNR ASE (signal bw, dB)', 'SNR NLI (signal bw, dB)', 'SNR total (signal bw, dB)'))
        for final_carrier, ch_osnr, ch_snr_nl, ch_snr in zip(infos[path[-1]][1].carriers, path[-1].osnr_ase, path[-1].osnr_nli, path[-1].snr):
            ch_freq = final_carrier.frequency * 1e-12
            ch_power = lin2db(final_carrier.power.signal*1e3)
            print('{:5}{:26.2f}{:26.2f}{:28.2f}{:28.2f}{:28.2f}' \
                  .format(final_carrier.channel_number, round(ch_freq, 2), round(ch_power, 2), round(ch_osnr, 2), round(ch_snr_nl, 2), round(ch_snr, 2)))

    if not args.source:
        print(f'\n(No source node specified: picked {source.uid})')
    elif not valid_source:
        print(f'\n(Invalid source node {args.source!r} replaced with {source.uid})')

    if not args.destination:
        print(f'\n(No destination node specified: picked {destination.uid})')
    elif not valid_destination:
        print(f'\n(Invalid destination node {args.destination!r} replaced with {destination.uid})')

    if args.plot:
        plot_results(network, path, source, destination, infos)
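The source/destination lookup in the example above tries an exact uid match first, then falls back to a case-insensitive substring match over the remaining transceivers. A small stand-alone sketch of that two-step lookup (the `pick_node` helper and the sample uids are illustrative only; GNPy inlines this logic rather than wrapping it in a function):

```python
def pick_node(wanted, nodes):
    """Exact match first, then case-insensitive substring match,
    mirroring the source/destination lookup in the example.
    `nodes` maps uid -> node; returns None when nothing matches."""
    if wanted in nodes:
        return nodes.pop(wanted)
    # next() stops at the first substring hit, so the pop() does not
    # invalidate the iteration (same pattern as the original code)
    return next((nodes.pop(uid) for uid in nodes
                 if wanted.lower() in uid.lower()), None)


transceivers = {'trx Site_A': 'A', 'trx Site_B': 'B'}
print(pick_node('site_b', transceivers))  # substring match -> B
```

Popping the matched node out of the dict is what lets a later "pick random" fallback avoid choosing the same node twice.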
@@ -17,11 +17,16 @@ from argparse import ArgumentParser
PARSER = ArgumentParser()
PARSER.add_argument('workbook', nargs='?', default='meshTopologyExampleV2.xls',
                    help='create the mandatory columns in Eqpt sheet')
ALL_ROWS = lambda sh, start=0: (sh.row(x) for x in range(start, sh.nrows))


def ALL_ROWS(sh, start=0):
    return (sh.row(x) for x in range(start, sh.nrows))


class Node:
    """ Node element contains uid, list of connected nodes and eqpt type
    """

    def __init__(self, uid, to_node):
        self.uid = uid
        self.to_node = to_node
@@ -33,6 +38,7 @@ class Node:

    def __str__(self):
        return f'uid {self.uid} \nto_node {[node for node in self.to_node]}\neqpt {self.eqpt}\n'


def read_excel(input_filename):
    """ read excel Nodes and Links sheets and create a dict of nodes with
    their to_nodes and type of eqpt
@@ -70,6 +76,7 @@ def read_excel(input_filename):
        exit()
    return nodes


def create_eqt_template(nodes, input_filename):
    """ writes list of node A node Z corresponding to Nodes and Links sheets in order
    to help user populating Eqpt
@@ -82,7 +89,6 @@ def create_eqt_template(nodes, input_filename):
        \nNode A \tNode Z \tamp type \tatt_in \tamp gain \ttilt \tatt_out\
        amp type \tatt_in \tamp gain \ttilt \tatt_out\n')

        for node in nodes.values():
            if node.eqpt == 'ILA':
                my_file.write(f'{node.uid}\t{node.to_node[0]}\n')
@@ -90,8 +96,8 @@ def create_eqt_template(nodes, input_filename):
            for to_node in node.to_node:
                my_file.write(f'{node.uid}\t{to_node}\n')

    print(f'File {output_filename} successfully created with Node A - Node Z ' +
          ' entries for Eqpt sheet in excel file.')
    print(f'File {output_filename} successfully created with Node A - Node Z entries for Eqpt sheet in excel file.')


if __name__ == '__main__':
    ARGS = PARSER.parse_args()
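The `ALL_ROWS` helper above is a generator over an xlrd worksheet, rewritten here from an assigned lambda into a named function (the change autopep8 insists on). A runnable sketch of the same pattern, using a minimal stand-in object since an xlrd sheet only needs to expose `nrows` and `row(x)` for this helper (the `FakeSheet` class is purely illustrative):

```python
def all_rows(sheet, start=0):
    """Yield rows start..end, like the ALL_ROWS helper rewritten above
    (a named function instead of an assigned lambda)."""
    return (sheet.row(x) for x in range(start, sheet.nrows))


class FakeSheet:
    """Minimal stand-in for an xlrd sheet: just nrows and row()."""

    def __init__(self, rows):
        self._rows = rows
        self.nrows = len(rows)

    def row(self, x):
        return self._rows[x]


sheet = FakeSheet([['header'], ['Site_A'], ['Site_B']])
print(list(all_rows(sheet, start=1)))  # skip the header row
```

The `start` argument is how the real script skips the header rows of the Nodes and Links sheets.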
@@ -8,7 +8,7 @@ Amplifier models and configuration

Equipment description defines equipment types and parameters.
It takes place in the default **eqpt_config.json** file.
By default **transmission_main_example.py** uses **eqpt_config.json** file and that
By default **gnpy-transmission-example** uses **eqpt_config.json** file and that
can be changed with **-e** or **--equipment** command line parameter.

2. Amplifier parameters and subtypes
@@ -266,7 +266,7 @@ In an opensource and multi-vendor environnement, it is needed to support differe
4. advanced_config_from_json
#######################################

The build_oa_json.py library in gnpy/examples/edfa_model can be used to build the json file required for the amplifier advanced_model type_def:
The build_oa_json.py library in ``gnpy/example-data/edfa_model/`` can be used to build the json file required for the amplifier advanced_model type_def:

Update an existing json file with all the 96ch txt files for a given amplifier type
amplifier type 'OA_type1' is hard coded but can be modified and other types added
@@ -13,7 +13,6 @@ import re
import sys
import json
import numpy as np
from gnpy.core.utils import lin2db, db2lin

"""amplifier file names
convert a set of amplifier files + input json definition file into a valid edfa_json_file:
@@ -28,60 +27,63 @@ input json file in argument (defult = 'OA.json')
the json input file should have the following fields:
{
"nf_fit_coeff": "nf_filename.txt",
"nf_ripple": "nf_ripple_filename.txt",
"nf_ripple": "nf_ripple_filename.txt",
"gain_ripple": "DFG_filename.txt",
"dgt": "DGT_filename.txt",
}

"""

input_json_file_name = "OA.json"  #default path
input_json_file_name = "OA.json"  # default path
output_json_file_name = "default_edfa_config.json"
gain_ripple_field = "gain_ripple"
nf_ripple_field = "nf_ripple"
nf_fit_coeff = "nf_fit_coeff"


def read_file(field, file_name):
    """read and format the 96 channels txt files describing the amplifier NF and ripple
    convert dfg into gain ripple by removing the mean component
    """

    #with open(path + file_name,'r') as this_file:
    # with open(path + file_name,'r') as this_file:
    #     data = this_file.read()
    #data.strip()
    # data.strip()
    #data = re.sub(r"([0-9])([ ]{1,3})([0-9-+])",r"\1,\3",data)
    #data = list(data.split(","))
    #data = [float(x) for x in data]
    data = np.loadtxt(file_name)
    print(len(data), file_name)
    if field == gain_ripple_field or field == nf_ripple_field:
        #consider ripple excursion only to avoid redundant information
        #because the max flat_gain is already given by the 'gain_flat' field in json
        #remove the mean component
        # consider ripple excursion only to avoid redundant information
        # because the max flat_gain is already given by the 'gain_flat' field in json
        # remove the mean component
        print(file_name, ', mean value =', data.mean(), ' is subtracted')
        data = data - data.mean()
    data = data.tolist()
    return data


def input_json(path):
    """read the json input file and add all the 96 channels txt files
    create the output json file with output_json_file_name"""
    with open(path,'r') as edfa_json_file:
    with open(path, 'r') as edfa_json_file:
        amp_text = edfa_json_file.read()
    amp_dict = json.loads(amp_text)

    for k, v in amp_dict.items():
        if re.search(r'.txt$',str(v)) :
        if re.search(r'.txt$', str(v)):
            amp_dict[k] = read_file(k, v)

    amp_text = json.dumps(amp_dict, indent=4)
    #print(amp_text)
    with open(output_json_file_name,'w') as edfa_json_file:
    # print(amp_text)
    with open(output_json_file_name, 'w') as edfa_json_file:
        edfa_json_file.write(amp_text)


if __name__ == '__main__':
    if len(sys.argv) == 2:
        path = sys.argv[1]
    else:
        path = input_json_file_name
    input_json(path)
    input_json(path)
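The `read_file()` helper above keeps only the ripple excursion of the gain/NF data by subtracting the mean, since the absolute flat gain is already carried by the `gain_flat` field in the JSON. A minimal sketch of that step (the `ripple_excursion` function name is illustrative):

```python
import numpy as np


def ripple_excursion(data):
    """Strip the mean so only the ripple excursion remains, as
    read_file() above does for gain_ripple / nf_ripple arrays."""
    data = np.asarray(data, dtype=float)
    return (data - data.mean()).tolist()


print(ripple_excursion([1.0, 2.0, 3.0]))  # -> [-1.0, 0.0, 1.0]
```

After the subtraction, the result always sums to (numerically) zero, which is what makes the stored ripple independent of the amplifier's flat gain.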
@@ -18,10 +18,10 @@ from gnpy.tools.json_io import load_equipment
from gnpy.topology.request import jsontocsv


parser = ArgumentParser(description = 'A function that writes json path results in an excel sheet.')
parser.add_argument('filename', nargs='?', type = Path)
parser.add_argument('output_filename', nargs='?', type = Path)
parser.add_argument('eqpt_filename', nargs='?', type = Path, default=Path(__file__).parent / 'eqpt_config.json')
parser = ArgumentParser(description='A function that writes json path results in an excel sheet.')
parser.add_argument('filename', nargs='?', type=Path)
parser.add_argument('output_filename', nargs='?', type=Path)
parser.add_argument('eqpt_filename', nargs='?', type=Path, default=Path(__file__).parent / 'eqpt_config.json')

if __name__ == '__main__':
    args = parser.parse_args()
@@ -32,5 +32,4 @@ if __name__ == '__main__':
        json_data = loads(f.read())
        equipment = load_equipment(args.eqpt_filename)
        print(f'Writing in {args.output_filename}')
        jsontocsv(json_data,equipment,file)

        jsontocsv(json_data, equipment, file)
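The cleaned-up parser above relies on stdlib argparse features: optional positionals (`nargs='?'`), automatic conversion to `pathlib.Path`, and a default taken relative to the script. A self-contained sketch of the same pattern (the file names passed to `parse_args` are made-up inputs, not GNPy files):

```python
from argparse import ArgumentParser
from pathlib import Path

parser = ArgumentParser(description='A function that writes json path results in an excel sheet.')
parser.add_argument('filename', nargs='?', type=Path)
parser.add_argument('output_filename', nargs='?', type=Path)
# the third positional falls back to a default when omitted
parser.add_argument('eqpt_filename', nargs='?', type=Path, default=Path('eqpt_config.json'))

# parse an explicit argv list instead of sys.argv, handy for testing
args = parser.parse_args(['results.json', 'results.csv'])
print(args.filename, args.output_filename, args.eqpt_filename)
```

Passing an explicit list to `parse_args()` is the same trick the refactored entry points use to support being called as plain functions.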
@@ -8,41 +8,426 @@ gnpy.tools.cli_examples
Common code for CLI examples
'''

import argparse
from json import dumps
import logging
import os.path
import sys
from math import ceil
from numpy import linspace, mean
from pathlib import Path
import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.core.elements import RamanFiber
from gnpy.core.elements import Transceiver, Fiber, RamanFiber
from gnpy.core.equipment import trx_mode_params
import gnpy.core.exceptions as exceptions
from gnpy.core.network import build_network
from gnpy.core.parameters import SimParams
from gnpy.core.science_utils import Simulation
from gnpy.tools.json_io import load_equipment, load_network, load_json
from gnpy.core.utils import db2lin, lin2db, automatic_nch
from gnpy.topology.request import (ResultElement, jsontocsv, compute_path_dsjctn, requests_aggregation,
                                   BLOCKING_NOPATH, correct_json_route_list,
                                   deduplicate_disjunctions, compute_path_with_disjunction,
                                   PathRequest, compute_constrained_path, propagate2)
from gnpy.topology.spectrum_assignment import build_oms_list, pth_assign_spectrum
from gnpy.tools.json_io import load_equipment, load_network, load_json, load_requests, save_network, \
    requests_from_json, disjunctions_from_json, save_json
from gnpy.tools.plots import plot_baseline, plot_results

_logger = logging.getLogger(__name__)
_examples_dir = Path(__file__).parent.parent / 'example-data'
_help_footer = '''
This program is part of GNPy, https://github.com/TelecomInfraProject/oopt-gnpy

Learn more at https://gnpy.readthedocs.io/

'''
_help_fname_json = 'FILE.json'
_help_fname_json_csv = 'FILE.(json|csv)'


def load_common_data(equipment_filename, topology_filename, simulation_filename=None, fuzzy_name_matching=False):
def show_example_data_dir():
    print(f'{_examples_dir}/')


def load_common_data(equipment_filename, topology_filename, simulation_filename, save_raw_network_filename):
    '''Load common configuration from JSON files'''

    try:
        equipment = load_equipment(equipment_filename)
        network = load_network(topology_filename, equipment, fuzzy_name_matching)
        network = load_network(topology_filename, equipment)
        if save_raw_network_filename is not None:
            save_network(network, save_raw_network_filename)
            print(f'{ansi_escapes.blue}Raw network (no optimizations) saved to {save_raw_network_filename}{ansi_escapes.reset}')
        sim_params = SimParams(**load_json(simulation_filename)) if simulation_filename is not None else None
        if not sim_params:
            if next((node for node in network if isinstance(node, RamanFiber)), None) is not None:
                print(f'{ansi_escapes.red}Invocation error:{ansi_escapes.reset} '
                      f'RamanFiber requires passing simulation params via --sim-params')
                exit(1)
                sys.exit(1)
        else:
            Simulation.set_params(sim_params)
    except exceptions.EquipmentConfigError as e:
        print(f'{ansi_escapes.red}Configuration error in the equipment library:{ansi_escapes.reset} {e}')
        exit(1)
        sys.exit(1)
    except exceptions.NetworkTopologyError as e:
        print(f'{ansi_escapes.red}Invalid network definition:{ansi_escapes.reset} {e}')
        exit(1)
        sys.exit(1)
    except exceptions.ConfigurationError as e:
        print(f'{ansi_escapes.red}Configuration error:{ansi_escapes.reset} {e}')
        exit(1)
        sys.exit(1)
    except exceptions.ParametersError as e:
        print(f'{ansi_escapes.red}Simulation parameters error:{ansi_escapes.reset} {e}')
        exit(1)
        sys.exit(1)
    except exceptions.ServiceError as e:
        print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
        exit(1)
        sys.exit(1)

    return (equipment, network)


def _setup_logging(args):
    logging.basicConfig(level={2: logging.DEBUG, 1: logging.INFO, 0: logging.CRITICAL}.get(args.verbose, logging.DEBUG))


def _add_common_options(parser: argparse.ArgumentParser, network_default: Path):
    parser.add_argument('topology', nargs='?', type=Path, metavar='NETWORK-TOPOLOGY.(json|xls|xlsx)',
                        default=network_default,
                        help='Input network topology')
    parser.add_argument('-v', '--verbose', action='count', default=0,
                        help='Increase verbosity (can be specified several times)')
    parser.add_argument('-e', '--equipment', type=Path, metavar=_help_fname_json,
                        default=_examples_dir / 'eqpt_config.json', help='Equipment library')
    parser.add_argument('--sim-params', type=Path, metavar=_help_fname_json,
                        default=None, help='Path to the JSON containing simulation parameters (required for Raman). '
                        f'Example: {_examples_dir / "sim_params.json"}')
    parser.add_argument('--save-network', type=Path, metavar=_help_fname_json,
                        help='Save the final network as a JSON file')
    parser.add_argument('--save-network-before-autodesign', type=Path, metavar=_help_fname_json,
                        help='Dump the network into a JSON file prior to autodesign')

def transmission_main_example(args=None):
    parser = argparse.ArgumentParser(
        description='Send a full spectrum load through the network from point A to point B',
        epilog=_help_footer,
        formatter_class=argparse.ArgumentDefaultsHelpFormatter,
    )
    _add_common_options(parser, network_default=_examples_dir / 'edfa_example_network.json')
    parser.add_argument('--show-channels', action='store_true', help='Show final per-channel OSNR summary')
    parser.add_argument('-pl', '--plot', action='store_true')
    parser.add_argument('-l', '--list-nodes', action='store_true', help='list all transceiver nodes')
    parser.add_argument('-po', '--power', default=0, help='channel ref power in dBm')
    parser.add_argument('source', nargs='?', help='source node')
    parser.add_argument('destination', nargs='?', help='destination node')

    args = parser.parse_args(args if args is not None else sys.argv[1:])
    _setup_logging(args)

    (equipment, network) = load_common_data(args.equipment, args.topology, args.sim_params, args.save_network_before_autodesign)

    if args.plot:
        plot_baseline(network)

    transceivers = {n.uid: n for n in network.nodes() if isinstance(n, Transceiver)}

    if not transceivers:
        sys.exit('Network has no transceivers!')
    if len(transceivers) < 2:
        sys.exit('Network has only one transceiver!')

    if args.list_nodes:
        for uid in transceivers:
            print(uid)
        sys.exit()

    # First try to find exact match if source/destination provided
    if args.source:
        source = transceivers.pop(args.source, None)
        valid_source = True if source else False
    else:
        source = None
        _logger.info('No source node specified: picking random transceiver')

    if args.destination:
        destination = transceivers.pop(args.destination, None)
        valid_destination = True if destination else False
    else:
        destination = None
        _logger.info('No destination node specified: picking random transceiver')

    # If no exact match try to find partial match
    if args.source and not source:
        # TODO code a more advanced regex to find nodes match
        source = next((transceivers.pop(uid) for uid in transceivers
                       if args.source.lower() in uid.lower()), None)

    if args.destination and not destination:
        # TODO code a more advanced regex to find nodes match
        destination = next((transceivers.pop(uid) for uid in transceivers
                            if args.destination.lower() in uid.lower()), None)

    # If no partial match or no source/destination provided pick random
    if not source:
        source = list(transceivers.values())[0]
        del transceivers[source.uid]

    if not destination:
        destination = list(transceivers.values())[0]

    _logger.info(f'source = {args.source!r}')
    _logger.info(f'destination = {args.destination!r}')

    params = {}
    params['request_id'] = 0
    params['trx_type'] = ''
    params['trx_mode'] = ''
    params['source'] = source.uid
    params['destination'] = destination.uid
    params['bidir'] = False
    params['nodes_list'] = [destination.uid]
    params['loose_list'] = ['strict']
    params['format'] = ''
    params['path_bandwidth'] = 0
    trx_params = trx_mode_params(equipment)
    if args.power:
        trx_params['power'] = db2lin(float(args.power)) * 1e-3
    params.update(trx_params)
    req = PathRequest(**params)

    power_mode = equipment['Span']['default'].power_mode
    print('\n'.join([f'Power mode is set to {power_mode}',
                     f'=> it can be modified in eqpt_config.json - Span']))

    pref_ch_db = lin2db(req.power * 1e3)  # reference channel power / span (SL=20dB)
    pref_total_db = pref_ch_db + lin2db(req.nb_channel)  # reference total power / span (SL=20dB)
    build_network(network, equipment, pref_ch_db, pref_total_db)
    path = compute_constrained_path(network, req)

    spans = [s.params.length for s in path if isinstance(s, RamanFiber) or isinstance(s, Fiber)]
    print(f'\nThere are {len(spans)} fiber spans over {sum(spans)/1000:.0f} km between {source.uid} '
          f'and {destination.uid}')
    print(f'\nNow propagating between {source.uid} and {destination.uid}:')

    try:
        p_start, p_stop, p_step = equipment['SI']['default'].power_range_db
        p_num = abs(int(round((p_stop - p_start) / p_step))) + 1 if p_step != 0 else 1
        power_range = list(linspace(p_start, p_stop, p_num))
    except TypeError:
        print('invalid power range definition in eqpt_config, should be power_range_db: [lower, upper, step]')
        power_range = [0]

    if not power_mode:
        # power cannot be changed in gain mode
        power_range = [0]
    for dp_db in power_range:
        req.power = db2lin(pref_ch_db + dp_db) * 1e-3
        if power_mode:
            print(f'\nPropagating with input power = {ansi_escapes.cyan}{lin2db(req.power*1e3):.2f} dBm{ansi_escapes.reset}:')
        else:
            print(f'\nPropagating in {ansi_escapes.cyan}gain mode{ansi_escapes.reset}: power cannot be set manually')
        infos = propagate2(path, req, equipment)
        if len(power_range) == 1:
            for elem in path:
                print(elem)
            if power_mode:
                print(f'\nTransmission result for input power = {lin2db(req.power*1e3):.2f} dBm:')
            else:
                print(f'\nTransmission results:')
            print(f'  Final SNR total (0.1 nm): {ansi_escapes.cyan}{mean(destination.snr_01nm):.02f} dB{ansi_escapes.reset}')
        else:
            print(path[-1])

    # print(f'\n !!!!!!!!!!!!!!!!! TEST POINT !!!!!!!!!!!!!!!!!!!!!')
    # print(f'carriers ase output of {path[1]} =\n {list(path[1].carriers("out", "nli"))}')
    # => use "in" or "out" parameter
    # => use "nli" or "ase" or "signal" or "total" parameter

    if args.save_network is not None:
        save_network(network, args.save_network)
        print(f'{ansi_escapes.blue}Network (after autodesign) saved to {args.save_network}{ansi_escapes.reset}')

    if args.show_channels:
        print('\nThe total SNR per channel at the end of the line is:')
        print(
            '{:>5}{:>26}{:>26}{:>28}{:>28}{:>28}' .format(
                'Ch. #',
                'Channel frequency (THz)',
                'Channel power (dBm)',
                'OSNR ASE (signal bw, dB)',
                'SNR NLI (signal bw, dB)',
                'SNR total (signal bw, dB)'))
        for final_carrier, ch_osnr, ch_snr_nl, ch_snr in zip(
                infos[path[-1]][1].carriers, path[-1].osnr_ase, path[-1].osnr_nli, path[-1].snr):
            ch_freq = final_carrier.frequency * 1e-12
            ch_power = lin2db(final_carrier.power.signal * 1e3)
            print(
                '{:5}{:26.2f}{:26.2f}{:28.2f}{:28.2f}{:28.2f}' .format(
                    final_carrier.channel_number, round(
                        ch_freq, 2), round(
                        ch_power, 2), round(
                        ch_osnr, 2), round(
                        ch_snr_nl, 2), round(
                        ch_snr, 2)))

    if not args.source:
        print(f'\n(No source node specified: picked {source.uid})')
    elif not valid_source:
        print(f'\n(Invalid source node {args.source!r} replaced with {source.uid})')

    if not args.destination:
        print(f'\n(No destination node specified: picked {destination.uid})')
    elif not valid_destination:
        print(f'\n(Invalid destination node {args.destination!r} replaced with {destination.uid})')

    if args.plot:
        plot_results(network, path, source, destination, infos)

def _path_result_json(pathresult):
return {'response': [n.json for n in pathresult]}


def path_requests_run(args=None):
parser = argparse.ArgumentParser(
description='Compute performance for a list of services provided in a json file or an excel sheet',
epilog=_help_footer,
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
)
_add_common_options(parser, network_default=_examples_dir / 'meshTopologyExampleV2.xls')
parser.add_argument('service_filename', nargs='?', type=Path, metavar='SERVICES-REQUESTS.(json|xls|xlsx)',
default=_examples_dir / 'meshTopologyExampleV2.xls',
help='Input service file')
parser.add_argument('-bi', '--bidir', action='store_true',
help='considers that all demands are bidir')
parser.add_argument('-o', '--output', type=Path, metavar=_help_fname_json_csv,
help='Store satisfied requests into a JSON or CSV file')

args = parser.parse_args(args if args is not None else sys.argv[1:])
_setup_logging(args)

_logger.info(f'Computing path requests {args.service_filename} into JSON format')
print(f'{ansi_escapes.blue}Computing path requests {os.path.relpath(args.service_filename)} into JSON format{ansi_escapes.reset}')

(equipment, network) = load_common_data(args.equipment, args.topology, args.sim_params, args.save_network_before_autodesign)

# Build the network once using the default power defined in SI in eqpt config
# TODO power density: db2linp(ower_dbm": 0)/power_dbm": 0 * nb channels as defined by
# spacing, f_min and f_max
p_db = equipment['SI']['default'].power_dbm

p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
if args.save_network is not None:
save_network(network, args.save_network)
print(f'{ansi_escapes.blue}Network (after autodesign) saved to {args.save_network}{ansi_escapes.reset}')
oms_list = build_oms_list(network, equipment)

try:
data = load_requests(args.service_filename, equipment, bidir=args.bidir,
network=network, network_filename=args.topology)
rqs = requests_from_json(data, equipment)
except exceptions.ServiceError as e:
print(f'{ansi_escapes.red}Service error:{ansi_escapes.reset} {e}')
sys.exit(1)
# check that request ids are unique. Non unique ids, may
# mess the computation: better to stop the computation
all_ids = [r.request_id for r in rqs]
if len(all_ids) != len(set(all_ids)):
for item in list(set(all_ids)):
all_ids.remove(item)
msg = f'Requests id {all_ids} are not unique'
_logger.critical(msg)
sys.exit()
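The uniqueness check above strips one copy of each id from `all_ids`, so only the duplicated ids remain when the message is logged. A self-contained sketch of the same idea (the `duplicated_ids` helper is hypothetical, not part of GNPy):

```python
from collections import Counter

def duplicated_ids(request_ids):
    """Return the ids that occur more than once, each listed once (illustrative helper)."""
    return sorted(rid for rid, count in Counter(request_ids).items() if count > 1)

# Two requests share id '1', so the computation should stop.
print(duplicated_ids(['1', '2', '1', '3']))  # ['1']
```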
rqs = correct_json_route_list(network, rqs)

# pths = compute_path(network, equipment, rqs)
dsjn = disjunctions_from_json(data)

print(f'{ansi_escapes.blue}List of disjunctions{ansi_escapes.reset}')
print(dsjn)
# need to warn or correct in case of wrong disjunction form
# disjunction must not be repeated with same or different ids
dsjn = deduplicate_disjunctions(dsjn)

# Aggregate demands with same exact constraints
print(f'{ansi_escapes.blue}Aggregating similar requests{ansi_escapes.reset}')

rqs, dsjn = requests_aggregation(rqs, dsjn)
# TODO export novel set of aggregated demands in a json file

print(f'{ansi_escapes.blue}The following services have been requested:{ansi_escapes.reset}')
print(rqs)

print(f'{ansi_escapes.blue}Computing all paths with constraints{ansi_escapes.reset}')
try:
pths = compute_path_dsjctn(network, equipment, rqs, dsjn)
except exceptions.DisjunctionError as this_e:
print(f'{ansi_escapes.red}Disjunction error:{ansi_escapes.reset} {this_e}')
sys.exit(1)

print(f'{ansi_escapes.blue}Propagating on selected path{ansi_escapes.reset}')
propagatedpths, reversed_pths, reversed_propagatedpths = compute_path_with_disjunction(network, equipment, rqs, pths)
# Note that deepcopy used in compute_path_with_disjunction returns
# a list of nodes which are not belonging to network (they are copies of the node objects).
# so there can not be propagation on these nodes.

pth_assign_spectrum(pths, rqs, oms_list, reversed_pths)

print(f'{ansi_escapes.blue}Result summary{ansi_escapes.reset}')
header = ['req id', ' demand', ' snr@bandwidth A-Z (Z-A)', ' snr@0.1nm A-Z (Z-A)',
' Receiver minOSNR', ' mode', ' Gbit/s', ' nb of tsp pairs',
'N,M or blocking reason']
data = []
data.append(header)
for i, this_p in enumerate(propagatedpths):
rev_pth = reversed_propagatedpths[i]
if rev_pth and this_p:
psnrb = f'{round(mean(this_p[-1].snr),2)} ({round(mean(rev_pth[-1].snr),2)})'
psnr = f'{round(mean(this_p[-1].snr_01nm), 2)}' +\
f' ({round(mean(rev_pth[-1].snr_01nm),2)})'
elif this_p:
psnrb = f'{round(mean(this_p[-1].snr),2)}'
psnr = f'{round(mean(this_p[-1].snr_01nm),2)}'

try:
if rqs[i].blocking_reason in BLOCKING_NOPATH:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} :',
f'-', f'-', f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',
f'-', f'{rqs[i].blocking_reason}']
else:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,
psnr, f'-', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9, 2)}',
f'-', f'{rqs[i].blocking_reason}']
except AttributeError:
line = [f'{rqs[i].request_id}', f' {rqs[i].source} to {rqs[i].destination} : ', psnrb,
psnr, f'{rqs[i].OSNR}', f'{rqs[i].tsp_mode}', f'{round(rqs[i].path_bandwidth * 1e-9,2)}',
f'{ceil(rqs[i].path_bandwidth / rqs[i].bit_rate) }', f'({rqs[i].N},{rqs[i].M})']
data.append(line)

col_width = max(len(word) for row in data for word in row[2:]) # padding
firstcol_width = max(len(row[0]) for row in data) # padding
secondcol_width = max(len(row[1]) for row in data) # padding
for row in data:
firstcol = ''.join(row[0].ljust(firstcol_width))
secondcol = ''.join(row[1].ljust(secondcol_width))
remainingcols = ''.join(word.center(col_width, ' ') for word in row[2:])
print(f'{firstcol} {secondcol} {remainingcols}')
print(f'{ansi_escapes.yellow}Result summary shows mean SNR and OSNR (average over all channels){ansi_escapes.reset}')

if args.output:
result = []
# assumes that list of rqs and list of propagatedpths have same order
for i, pth in enumerate(propagatedpths):
result.append(ResultElement(rqs[i], pth, reversed_propagatedpths[i]))
temp = _path_result_json(result)
if args.output.suffix.lower() == '.json':
save_json(temp, args.output)
print(f'{ansi_escapes.blue}Saved JSON to {args.output}{ansi_escapes.reset}')
elif args.output.suffix.lower() == '.csv':
with open(args.output, "w", encoding='utf-8') as fcsv:
jsontocsv(temp, equipment, fcsv)
print(f'{ansi_escapes.blue}Saved CSV to {args.output}{ansi_escapes.reset}')
else:
print(f'{ansi_escapes.red}Cannot save output: neither JSON nor CSV file{ansi_escapes.reset}')
sys.exit(1)

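The `--output` handling above dispatches on the file suffix: `.json` goes through `save_json`, `.csv` through `jsontocsv`, anything else aborts. A minimal standalone sketch of that dispatch (the `output_format` helper is illustrative, not part of GNPy):

```python
from pathlib import Path

def output_format(output: Path) -> str:
    """Pick the writer based on the --output suffix, mirroring the branch above."""
    suffix = output.suffix.lower()
    if suffix == '.json':
        return 'json'
    if suffix == '.csv':
        return 'csv'
    raise ValueError('neither JSON nor CSV file')

print(output_format(Path('results.JSON')))  # json
```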
@@ -234,7 +234,7 @@ def sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city):
return nodes, links


def convert_file(input_filename, names_matching=False, filter_region=[]):
def xls_to_json_data(input_filename, filter_region=[]):
nodes, links, eqpts = parse_excel(input_filename)
if filter_region:
nodes = [n for n in nodes if n.region.lower() in filter_region]
@@ -260,7 +260,7 @@ def convert_file(input_filename, names_matching=False, filter_region=[]):

nodes, links = sanity_check(nodes, links, nodes_by_city, links_by_city, eqpts_by_city)

data = {
return {
'elements':
[{'uid': f'trx {x.city}',
'metadata': {'location': {'city': x.city,
@@ -392,10 +392,11 @@ def convert_file(input_filename, names_matching=False, filter_region=[]):
for x in nodes_by_city.values() if x.node_type.lower() == 'roadm'])))
}

suffix_filename = str(input_filename.suffixes[0])
full_input_filename = str(input_filename)
split_filename = [full_input_filename[0:len(full_input_filename) - len(suffix_filename)], suffix_filename[1:]]
output_json_file_name = split_filename[0] + '.json'

def convert_file(input_filename, filter_region=[], output_json_file_name=None):
data = xls_to_json_data(input_filename, filter_region)
if output_json_file_name is None:
output_json_file_name = input_filename.with_suffix('.json')
with open(output_json_file_name, 'w', encoding='utf-8') as edfa_json_file:
edfa_json_file.write(dumps(data, indent=2, ensure_ascii=False))
return output_json_file_name
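The rewritten `convert_file` replaces the manual suffix-splitting with `Path.with_suffix`. A quick standalone illustration of that default-output rule (the `default_output_name` helper is hypothetical, shown only to demonstrate the behavior):

```python
from pathlib import Path

def default_output_name(input_filename: Path, output_json_file_name=None) -> Path:
    """foo.xls -> foo.json, unless an explicit output name is given."""
    if output_json_file_name is None:
        output_json_file_name = input_filename.with_suffix('.json')
    return output_json_file_name

print(default_output_name(Path('meshTopologyExampleV2.xls')))  # meshTopologyExampleV2.json
```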
@@ -728,10 +729,17 @@ LINKS_COLUMN = 16
LINKS_LINE = 3
EQPTS_LINE = 3
EQPTS_COLUMN = 14
parser = ArgumentParser()
parser.add_argument('workbook', nargs='?', type=Path, default='meshTopologyExampleV2.xls')
parser.add_argument('-f', '--filter-region', action='append', default=[])


def _do_convert():
parser = ArgumentParser()
parser.add_argument('workbook', type=Path)
parser.add_argument('-f', '--filter-region', action='append', default=[])
parser.add_argument('--output', type=Path, help='Name of the generated JSON file')
args = parser.parse_args()
res = convert_file(args.workbook, args.filter_region, args.output)
print(f'XLS -> JSON saved to {res}')


if __name__ == '__main__':
args = parser.parse_args()
convert_file(args.workbook, args.filter_region)
_do_convert()

@@ -10,7 +10,6 @@ Loading and saving data from JSON files in GNPy's internal data format

from networkx import DiGraph
from logging import getLogger
from os import path
from pathlib import Path
import json
from collections import namedtuple
@@ -20,8 +19,8 @@ from gnpy.core.exceptions import ConfigurationError, EquipmentConfigError, Netwo
from gnpy.core.science_utils import estimate_nf_model
from gnpy.core.utils import automatic_nch, automatic_fmax, merge_amplifier_restrictions
from gnpy.topology.request import PathRequest, Disjunction
from gnpy.tools.convert import convert_file
from gnpy.tools.service_sheet import convert_service_sheet
from gnpy.tools.convert import xls_to_json_data
from gnpy.tools.service_sheet import read_service_sheet
import time


@@ -209,8 +208,7 @@ class Amp(_JsonThing):
raise EquipmentConfigError(f'missing preamp/booster variety input for amplifier: {type_variety} in equipment config')
dual_stage_def = Model_dual_stage(preamp_variety, booster_variety)

with open(config, encoding='utf-8') as f:
json_data = json.load(f)
json_data = load_json(config)

return cls(**{**kwargs, **json_data,
'nf_model': nf_def, 'dual_stage_model': dual_stage_def})
@@ -303,23 +301,23 @@ def _equipment_from_json(json_data, filename):
return equipment


def load_network(filename, equipment, name_matching=False):
json_filename = ''
def load_network(filename, equipment):
if filename.suffix.lower() in ('.xls', '.xlsx'):
_logger.info('Automatically generating topology JSON file')
json_filename = convert_file(filename, name_matching)
json_data = xls_to_json_data(filename)
elif filename.suffix.lower() == '.json':
json_filename = filename
json_data = load_json(filename)
else:
raise ValueError(f'unsuported topology filename extension {filename.suffix.lower()}')
json_data = load_json(json_filename)
raise ValueError(f'unsupported topology filename extension {filename.suffix.lower()}')
return network_from_json(json_data, equipment)
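The new `load_network` picks its loader purely from the file suffix, with no intermediate JSON file on disk. A self-contained sketch of that dispatch (the `detect_topology_format` helper is hypothetical; the real function calls `xls_to_json_data` / `load_json` instead of returning a tag):

```python
from pathlib import Path

def detect_topology_format(filename: Path) -> str:
    """Mirror load_network's suffix dispatch without touching any file."""
    suffix = filename.suffix.lower()
    if suffix in ('.xls', '.xlsx'):
        return 'xls'    # real code: json_data = xls_to_json_data(filename)
    if suffix == '.json':
        return 'json'   # real code: json_data = load_json(filename)
    raise ValueError(f'unsupported topology filename extension {suffix}')

print(detect_topology_format(Path('topology.XLSX')))  # xls
```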


def save_network(filename, network):
filename_output = path.splitext(filename)[0] + '_auto_design.json'
json_data = network_to_json(network)
save_json(json_data, filename_output)
def save_network(network: DiGraph, filename: str):
'''Dump the network into a JSON file

:param network: network to work on
:param filename: file to write to
'''
save_json(network_to_json(network), filename)


def _cls_for(equipment_type):
@@ -518,3 +516,18 @@ def disjunctions_from_json(json_data):
disjunctions_list.append(Disjunction(**params))

return disjunctions_list


def convert_service_sheet(
input_filename,
eqpt,
network,
network_filename=None,
output_filename='',
bidir=False,
filter_region=None):
if output_filename == '':
output_filename = f'{str(input_filename)[0:len(str(input_filename))-len(str(input_filename.suffixes[0]))]}_services.json'
data = read_service_sheet(input_filename, eqpt, network, network_filename, bidir, filter_region)
save_json(data, output_filename)
return data

@@ -11,11 +11,9 @@ Yang model for requesting path computation.
See: draft-ietf-teas-yang-path-computation-01.txt
"""

from sys import exit
from xlrd import open_workbook, XL_CELL_EMPTY
from collections import namedtuple
from logging import getLogger
from json import dumps
from copy import deepcopy
from gnpy.core.utils import db2lin
from gnpy.core.exceptions import ServiceError
@@ -24,7 +22,6 @@ import gnpy.core.ansi_escapes as ansi_escapes
from gnpy.tools.convert import corresp_names, corresp_next_node

SERVICES_COLUMN = 12
#EQPT_LIBRARY_FILENAME = Path(__file__).parent / 'eqpt_config.json'


def all_rows(sheet, start=0):
@@ -33,16 +30,12 @@ def all_rows(sheet, start=0):

logger = getLogger(__name__)

# Type for input data


class Request(namedtuple('Request', 'request_id source destination trx_type mode \
spacing power nb_channel disjoint_from nodes_list is_loose path_bandwidth')):
def __new__(cls, request_id, source, destination, trx_type, mode=None, spacing=None, power=None, nb_channel=None, disjoint_from='', nodes_list=None, is_loose='', path_bandwidth=None):
return super().__new__(cls, request_id, source, destination, trx_type, mode, spacing, power, nb_channel, disjoint_from, nodes_list, is_loose, path_bandwidth)

# Type for output data: // from dutc


class Element:
def __eq__(self, other):
@@ -180,12 +173,11 @@ class Request_element(Element):
return self.pathrequest, self.pathsync


def convert_service_sheet(
def read_service_sheet(
input_filename,
eqpt,
network,
network_filename=None,
output_filename='',
bidir=False,
filter_region=None):
""" converts a service sheet into a json structure
@@ -197,12 +189,6 @@ def convert_service_sheet(
service = parse_excel(input_filename)
req = [Request_element(n, eqpt, bidir) for n in service]
req = correct_xls_route_list(network_filename, network, req)
# dumps the output into a json file with name
# split_filename = [input_filename[0:len(input_filename)-len(suffix_filename)] , suffix_filename[1:]]
if output_filename == '':
output_filename = f'{str(input_filename)[0:len(str(input_filename))-len(str(input_filename.suffixes[0]))]}_services.json'
# for debug
# print(json_filename)
# if there is no sync vector , do not write any synchronization
synchro = [n.json[1] for n in req if n.json[1] is not None]
if synchro:
@@ -214,8 +200,6 @@ def convert_service_sheet(
data = {
'path-request': [n.json[0] for n in req]
}
with open(output_filename, 'w', encoding='utf-8') as f:
f.write(dumps(data, indent=2, ensure_ascii=False))
return data


@@ -228,13 +212,10 @@ def correct_xlrd_int_to_str_reading(v):
value = v
return value

# to be used from dutc


def parse_row(row, fieldnames):
return {f: r.value for f, r in zip(fieldnames, row[0:SERVICES_COLUMN])
if r.ctype != XL_CELL_EMPTY}
#


def parse_excel(input_filename):

@@ -7,7 +7,7 @@ Equipment and Network description definitions

Equipment description defines equipment types and those parameters.
Description is made in JSON file with predefined structure. By default
**transmission_main_example.py** uses **eqpt_config.json** file and that
**gnpy-transmission-example** uses **eqpt_config.json** file and that
can be changed with **-e** or **--equipment** command line parameter.
Parsing of JSON file is made with
**gnpy.core.equipment.load_equipment(equipment_description)** and return
@@ -82,7 +82,7 @@ it will be marked with **”default”** value.
*******************

Four types of EDFA definition are possible. Description JSON file
location is in **transmission_main_example.py** folder:
location is in **gnpy-transmission-example** folder:

- Advanced – with JSON file describing gain/noise figure tilt and
gain/noise figure ripple. **"advanced_config_from_json"** value
@@ -314,7 +314,7 @@ Note that ``OSNR`` parameter refers to the receiver's minimal OSNR threshold for
Network description defines network elements with additional to
equipment description parameters, metadata and elements interconnection.
Description is made in JSON file with predefined structure. By default
**transmission_main_example.py** uses **edfa_example_network.json** file
**gnpy-transmission-example** uses **edfa_example_network.json** file
and can be changed from command line. Parsing of JSON file is made with
**gnpy.core.network.load_network(network_description,
equipment_description)** and return value is **DiGraph** object which

@@ -41,4 +41,10 @@ packages = gnpy
data_files =
examples = examples/*
# FIXME: solve example data files
# FIXME: add example scripts ("entry points")

[options.entry_points]
console_scripts =
gnpy-example-data = gnpy.tools.cli_examples:show_example_data_dir
gnpy-transmission-example = gnpy.tools.cli_examples:transmission_main_example
gnpy-path-request = gnpy.tools.cli_examples:path_requests_run
gnpy-convert-xls = gnpy.tools.convert:_do_convert

@@ -1,8 +1,9 @@
#!/usr/bin/env python3
from json import load, dump
from json import dump
from pathlib import Path
from argparse import ArgumentParser
from collections import namedtuple
from gnpy.tools.json_io import load_json


class Results(namedtuple('Results', 'missing extra different expected actual')):
@@ -122,12 +123,8 @@ def encode_sets(obj):

if __name__ == '__main__':
args = parser.parse_args()

with open(args.expected_output, encoding='utf-8') as f:
expected = load(f)

with open(args.actual_output, encoding='utf-8') as f:
actual = load(f)
expected = load_json(args.expected_output)
actual = load_json(args.actual_output)

result = COMPARISONS[args.comparison](expected, actual)


@@ -1,4 +1,4 @@
[1;34;40mComputing path requests examples/meshTopologyExampleV2.xls into JSON format[0m
[1;34;40mComputing path requests gnpy/example-data/meshTopologyExampleV2.xls into JSON format[0m
[1;34;40mList of disjunctions[0m
[Disjunction 3
relaxable: false

@@ -1,5 +1,4 @@
#!/usr/bin/env python3
# TelecomInfraProject/gnpy/examples
# Module name : test_automaticmodefeature.py
# Version :
# License : BSD 3-Clause Licence

@@ -1,5 +1,4 @@
#!/usr/bin/env python3
# TelecomInfraProject/gnpy/examples
# Module name : test_disjunction.py
# Version:
# License: BSD 3-Clause Licence

@@ -4,23 +4,40 @@ from pathlib import Path
import os
import pytest
import subprocess
from gnpy.tools.cli_examples import transmission_main_example, path_requests_run

SRC_ROOT = Path(__file__).parent.parent


@pytest.mark.parametrize("output, invocation", (
('transmission_main_example',
('./examples/transmission_main_example.py',)),
('path_requests_run',
('./examples/path_requests_run.py',)),
('transmission_main_example__raman',
('./examples/transmission_main_example.py', 'examples/raman_edfa_example_network.json',
'--sim', 'examples/sim_params.json', '--show-channels',)),
@pytest.mark.parametrize("output, handler, args", (
('transmission_main_example', transmission_main_example, []),
('path_requests_run', path_requests_run, []),
('transmission_main_example__raman', transmission_main_example,
['gnpy/example-data/raman_edfa_example_network.json', '--sim', 'gnpy/example-data/sim_params.json', '--show-channels', ]),
))
def test_example_invocation(output, invocation):
def test_example_invocation(capfdbinary, output, handler, args):
'''Make sure that our examples produce useful output'''
os.chdir(SRC_ROOT)
expected = open(SRC_ROOT / 'tests' / 'invocation' / output, mode='rb').read()
proc = subprocess.run(invocation, stdout=subprocess.PIPE, stderr=subprocess.PIPE, check=True)
assert proc.stderr == b''
assert proc.stdout == expected
handler(args)
captured = capfdbinary.readouterr()
assert captured.out == expected
assert captured.err == b''


@pytest.mark.parametrize('program', ('gnpy-transmission-example', 'gnpy-path-request'))
def test_run_wrapper(program):
'''Ensure that our wrappers really, really work'''
proc = subprocess.run((program, '--help'), stdout=subprocess.PIPE, stderr=subprocess.PIPE,
check=True, universal_newlines=True)
assert proc.stderr == ''
assert 'https://github.com/telecominfraproject/oopt-gnpy' in proc.stdout.lower()
assert 'https://gnpy.readthedocs.io/' in proc.stdout.lower()


def test_conversion_xls():
proc = subprocess.run(
('gnpy-convert-xls', SRC_ROOT / 'tests' / 'data' / 'testTopology.xls', '--output', '/dev/null'),
stdout=subprocess.PIPE, stderr=subprocess.PIPE, check=True, universal_newlines=True)
assert proc.stderr == ''
assert '/dev/null' in proc.stdout

@@ -1,19 +1,18 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import json
from pathlib import Path

from gnpy.core.parameters import SimParams
from gnpy.core.science_utils import Simulation
from gnpy.tools.json_io import load_json

TEST_DIR = Path(__file__).parent
DATA_DIR = TEST_DIR / 'data'


def test_sim_parameters():
f = open(DATA_DIR / 'test_sim_params.json')
j = json.load(f)
j = load_json(DATA_DIR / 'test_sim_params.json')
sim_params = SimParams(**j)
Simulation.set_params(sim_params)
s1 = Simulation.get_simulation()

@@ -15,9 +15,9 @@
- writing of results in json (same keys)
"""

from json import load
from pathlib import Path
from os import unlink
import shutil
from pandas import read_csv
import pytest
from tests.compare import compare_networks, compare_services
@@ -29,8 +29,8 @@ from gnpy.topology.request import (jsontocsv, requests_aggregation, compute_path
compute_path_with_disjunction, ResultElement, PathRequest)
from gnpy.topology.spectrum_assignment import build_oms_list, pth_assign_spectrum
from gnpy.tools.convert import convert_file
from gnpy.tools.json_io import load_network, save_network, load_equipment, requests_from_json, disjunctions_from_json
from gnpy.tools.service_sheet import convert_service_sheet, correct_xls_route_list
from gnpy.tools.json_io import load_json, load_network, save_network, load_equipment, requests_from_json, disjunctions_from_json
from gnpy.tools.service_sheet import read_service_sheet, correct_xls_route_list

TEST_DIR = Path(__file__).parent
DATA_DIR = TEST_DIR / 'data'
@@ -42,18 +42,17 @@ equipment = load_equipment(eqpt_filename)
DATA_DIR / 'CORONET_Global_Topology.xlsx': DATA_DIR / 'CORONET_Global_Topology_expected.json',
DATA_DIR / 'testTopology.xls': DATA_DIR / 'testTopology_expected.json',
}.items())
def test_excel_json_generation(xls_input, expected_json_output):
def test_excel_json_generation(tmpdir, xls_input, expected_json_output):
""" tests generation of topology json
"""
convert_file(xls_input)
xls_copy = Path(tmpdir) / xls_input.name
shutil.copyfile(xls_input, xls_copy)
convert_file(xls_copy)

actual_json_output = xls_input.with_suffix('.json')
with open(actual_json_output, encoding='utf-8') as f:
actual = load(f)
actual_json_output = xls_copy.with_suffix('.json')
actual = load_json(actual_json_output)
unlink(actual_json_output)

with open(expected_json_output, encoding='utf-8') as f:
expected = load(f)
expected = load_json(expected_json_output)

results = compare_networks(expected, actual)
assert not results.elements.missing
@@ -73,7 +72,7 @@ def test_excel_json_generation(xls_input, expected_json_output):
DATA_DIR / 'testTopology.xls':
DATA_DIR / 'testTopology_auto_design_expected.json',
}.items())
def test_auto_design_generation_fromxlsgainmode(xls_input, expected_json_output):
def test_auto_design_generation_fromxlsgainmode(tmpdir, xls_input, expected_json_output):
""" tests generation of topology json
test that the build network gives correct results in gain mode
"""
@@ -88,16 +87,11 @@ def test_auto_design_generation_fromxlsgainmode(xls_input, expected_json_output)
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
save_network(xls_input, network)

actual_json_output = xls_input.with_name(xls_input.stem + '_auto_design').with_suffix('.json')

with open(actual_json_output, encoding='utf-8') as f:
actual = load(f)
actual_json_output = tmpdir / xls_input.with_name(xls_input.stem + '_auto_design').with_suffix('.json').name
save_network(network, actual_json_output)
actual = load_json(actual_json_output)
unlink(actual_json_output)

with open(expected_json_output, encoding='utf-8') as f:
expected = load(f)
expected = load_json(expected_json_output)

results = compare_networks(expected, actual)
assert not results.elements.missing
@@ -116,7 +110,7 @@ def test_auto_design_generation_fromxlsgainmode(xls_input, expected_json_output)
DATA_DIR / 'testTopology_auto_design_expected.json':
DATA_DIR / 'testTopology_auto_design_expected.json',
}.items())
def test_auto_design_generation_fromjson(json_input, expected_json_output):
def test_auto_design_generation_fromjson(tmpdir, json_input, expected_json_output):
"""test that autodesign creates same file as an input file already autodesigned
"""
equipment = load_equipment(eqpt_filename)
@@ -130,16 +124,11 @@ def test_auto_design_generation_fromjson(json_input, expected_json_output):
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
save_network(json_input, network)

actual_json_output = json_input.with_name(json_input.stem + '_auto_design').with_suffix('.json')

with open(actual_json_output, encoding='utf-8') as f:
actual = load(f)
actual_json_output = tmpdir / json_input.with_name(json_input.stem + '_auto_design').with_suffix('.json').name
save_network(network, actual_json_output)
actual = load_json(actual_json_output)
unlink(actual_json_output)

with open(expected_json_output, encoding='utf-8') as f:
expected = load(f)
expected = load_json(expected_json_output)

results = compare_networks(expected, actual)
assert not results.elements.missing
@@ -152,7 +141,7 @@ def test_auto_design_generation_fromjson(json_input, expected_json_output):
# test services creation


@pytest.mark.parametrize('xls_input,expected_json_output', {
@pytest.mark.parametrize('xls_input, expected_json_output', {
DATA_DIR / 'testTopology.xls': DATA_DIR / 'testTopology_services_expected.json',
DATA_DIR / 'testService.xls': DATA_DIR / 'testService_services_expected.json'
}.items())
@@ -166,17 +155,10 @@ def test_excel_service_json_generation(xls_input, expected_json_output):
p_total_db = p_db + lin2db(automatic_nch(equipment['SI']['default'].f_min,
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)
convert_service_sheet(xls_input, equipment, network, network_filename=DATA_DIR / 'testTopology.xls')
from_xls = read_service_sheet(xls_input, equipment, network, network_filename=DATA_DIR / 'testTopology.xls')
expected = load_json(expected_json_output)

actual_json_output = xls_input.with_name(xls_input.stem + '_services').with_suffix('.json')
with open(actual_json_output, encoding='utf-8') as f:
actual = load(f)
unlink(actual_json_output)

with open(expected_json_output, encoding='utf-8') as f:
expected = load(f)

results = compare_services(expected, actual)
results = compare_services(expected, from_xls)
assert not results.requests.missing
assert not results.requests.extra
assert not results.requests.different
@@ -189,21 +171,20 @@ def test_excel_service_json_generation(xls_input, expected_json_output):
# test xls answers creation


@pytest.mark.parametrize('json_input, csv_output', {
DATA_DIR / 'testTopology_response.json': DATA_DIR / 'testTopology_response',
}.items())
def test_csv_response_generation(json_input, csv_output):
@pytest.mark.parametrize('json_input',
(DATA_DIR / 'testTopology_response.json', )
)
def test_csv_response_generation(tmpdir, json_input):
""" tests if generated csv is consistent with expected generation
|
||||
same columns (order not important)
|
||||
"""
|
||||
with open(json_input) as jsonfile:
|
||||
json_data = load(jsonfile)
|
||||
json_data = load_json(json_input)
|
||||
equipment = load_equipment(eqpt_filename)
|
||||
csv_filename = str(csv_output) + '.csv'
|
||||
csv_filename = Path(tmpdir / json_input.name).with_suffix('.csv')
|
||||
with open(csv_filename, 'w', encoding='utf-8') as fcsv:
|
||||
jsontocsv(json_data, equipment, fcsv)
|
||||
|
||||
expected_csv_filename = str(csv_output) + '_expected.csv'
|
||||
expected_csv_filename = json_input.parent / (json_input.stem + '_expected.csv')
|
||||
|
||||
# expected header
|
||||
# csv_header = \
|
||||
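The change above is the "don't clutter the source dir" pattern: the generated CSV goes into pytest's `tmpdir` fixture, while the expected file is derived from the input's own name next to it in the data directory. A sketch using only the standard library (file names are illustrative, not real gnpy data):

```python
# Sketch of the temp-directory pattern: generated output lives in a
# throwaway directory; the expected file sits next to the input.
import csv
import tempfile
from pathlib import Path

json_input = Path('data') / 'testTopology_response.json'  # illustrative path
with tempfile.TemporaryDirectory() as tmpdir:
    # generated file: <tmpdir>/testTopology_response.csv
    csv_filename = (Path(tmpdir) / json_input.name).with_suffix('.csv')
    with open(csv_filename, 'w', newline='', encoding='utf-8') as fcsv:
        csv.writer(fcsv).writerow(['response-id', 'source', 'destination'])
    generated = csv_filename.name

# expected file: data/testTopology_response_expected.csv
expected_csv = json_input.parent / (json_input.stem + '_expected.csv')
```

Unlike the old version, no cleanup code is needed: the generated file vanishes with the temporary directory when the `with` block exits.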
@@ -301,7 +282,7 @@ def test_json_response_generation(xls_input, expected_response_file):
equipment['SI']['default'].f_max, equipment['SI']['default'].spacing))
build_network(network, equipment, p_db, p_total_db)

data = convert_service_sheet(xls_input, equipment, network)
data = read_service_sheet(xls_input, equipment, network)
# change one of the request with bidir option to cover bidir case as well
data['path-request'][2]['bidirectional'] = True

@@ -333,12 +314,8 @@ def test_json_response_generation(xls_input, expected_response_file):
temp = {
'response': [n.json for n in result]
}
# load expected result and compare keys and values

with open(expected_response_file) as jsonfile:
expected = load(jsonfile)
# since we changes bidir attribute of request#2, need to add the corresponding
# metric in response
expected = load_json(expected_response_file)

for i, response in enumerate(temp['response']):
if i == 2:

@@ -6,7 +6,6 @@ checks that RamanFiber propagates properly the spectral information. In this way
are tested.
"""

import json
from pandas import read_csv
from numpy.testing import assert_allclose
from gnpy.core.info import create_input_spectral_information
@@ -23,8 +22,7 @@ def test_raman_fiber():
"""
# spectral information generation
power = 1e-3
with open(TEST_DIR / 'data' / 'eqpt_config.json', 'r') as file:
eqpt_params = json.load(file)
eqpt_params = load_json(TEST_DIR / 'data' / 'eqpt_config.json')
spectral_info_params = eqpt_params['SI'][0]
spectral_info_params.pop('power_dbm')
spectral_info_params.pop('power_range_db')

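Several hunks in this commit replace open-coded `open()` + `json.load()` pairs with a shared `load_json()` helper ("use load_json instead of open coding"). A minimal equivalent of that helper looks like the sketch below; gnpy's real helper has the same shape, but its exact module and signature are assumed here:

```python
# Minimal load_json() equivalent, plus a before/after illustration.
import json
import os
import tempfile

def load_json(filename):
    """Read a JSON file and return the decoded data."""
    with open(filename, encoding='utf-8') as f:
        return json.load(f)

# before: with open(path, 'r') as file: eqpt_params = json.load(file)
# after:  eqpt_params = load_json(path)
with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as tmp:
    json.dump({'SI': [{'power_dbm': 0, 'power_range_db': [0, 0, 1]}]}, tmp)
eqpt_params = load_json(tmp.name)
os.unlink(tmp.name)
```

Centralising the read in one helper keeps the encoding and error handling consistent across the tests and tools.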
@@ -1,5 +1,4 @@
#!/usr/bin/env python3
# TelecomInfraProject/gnpy/examples
# Module name: test_spectrum_assignment.py
# Version:
# License: BSD 3-Clause Licence

tox.ini
@@ -13,7 +13,7 @@ deps =
changedir = {toxinidir}
usedevelop = True
setenv =
cover: CI_COVERAGE_OPTS=--cov=gnpy --cov=examples --cov=tests --cov-report=
cover: CI_COVERAGE_OPTS=--cov=gnpy --cov=tests --cov-report=
commands =
pytest {env:CI_COVERAGE_OPTS:} -vv {posargs}
cover: coverage html -d cover