merged phase-2 into master

James Powell
2017-11-08 15:07:51 -05:00
42 changed files with 6976 additions and 1914 deletions

.editorconfig (deleted)

@@ -1,21 +0,0 @@
# http://editorconfig.org
root = true
[*]
indent_style = space
indent_size = 4
trim_trailing_whitespace = true
insert_final_newline = true
charset = utf-8
end_of_line = lf
[*.bat]
indent_style = tab
end_of_line = crlf
[LICENSE]
insert_final_newline = false
[Makefile]
indent_style = tab

ISSUE_TEMPLATE.md (deleted)

@@ -1,15 +0,0 @@
* gnpy version:
* Python version:
* Operating System:
### Description
Describe what you were trying to get done.
Tell us what happened, what went wrong, and what you expected to happen.
### What I Did
```
Paste the command(s) you ran and the output.
If there was a crash, please include the traceback here.
```

.gitignore (vendored, 3 changed lines)

@@ -60,3 +60,6 @@ target/
# pyenv python configuration file
.python-version
# MacOS DS_store
.DS_Store

.travis.yml (deleted)

@@ -1,29 +0,0 @@
# Config file for automatic testing at travis-ci.org
# This file will be regenerated if you run travis_pypi_setup.py
language: python
python:
- 3.5
- 3.4
- 3.3
- 2.7
- 2.6
# command to install dependencies, e.g. pip install -r requirements.txt --use-mirrors
install: pip install -U tox-travis
# command to run tests, e.g. python setup.py test
script: tox
# After you create the Github repo and add it to Travis, run the
# travis_pypi_setup.py script to finish PyPI deployment setup
deploy:
provider: pypi
distributions: sdist bdist_wheel
user: <TBD>
password:
secure: PLEASE_REPLACE_ME
on:
tags: true
repo: <TBD>/gnpy
python: 2.7

AUTHORS.rst (deleted)

@@ -1,13 +0,0 @@
=======
Credits
=======
Development Lead
----------------
* <TBD> <<TBD>@<TBD>.com>
Contributors
------------
None yet. Why not be the first?

CONTRIBUTING.rst (deleted)

@@ -1,114 +0,0 @@
.. highlight:: shell
============
Contributing
============
Contributions are welcome, and they are greatly appreciated! Every
little bit helps, and credit will always be given.
You can contribute in many ways:
Types of Contributions
----------------------
Report Bugs
~~~~~~~~~~~
Report bugs at https://github.com/<TBD>/gnpy/issues.
If you are reporting a bug, please include:
* Your operating system name and version.
* Any details about your local setup that might be helpful in troubleshooting.
* Detailed steps to reproduce the bug.
Fix Bugs
~~~~~~~~
Look through the GitHub issues for bugs. Anything tagged with "bug"
and "help wanted" is open to whoever wants to implement it.
Implement Features
~~~~~~~~~~~~~~~~~~
Look through the GitHub issues for features. Anything tagged with "enhancement"
and "help wanted" is open to whoever wants to implement it.
Write Documentation
~~~~~~~~~~~~~~~~~~~
gnpy could always use more documentation, whether as part of the
official gnpy docs, in docstrings, or even on the web in blog posts,
articles, and such.
Submit Feedback
~~~~~~~~~~~~~~~
The best way to send feedback is to file an issue at https://github.com/<TBD>/gnpy/issues.
If you are proposing a feature:
* Explain in detail how it would work.
* Keep the scope as narrow as possible, to make it easier to implement.
* Remember that this is a volunteer-driven project, and that contributions
are welcome :)
Get Started!
------------
Ready to contribute? Here's how to set up `gnpy` for local development.
1. Fork the `gnpy` repo on GitHub.
2. Clone your fork locally::
$ git clone git@github.com:your_name_here/gnpy.git
3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::
$ mkvirtualenv gnpy
$ cd gnpy/
$ python setup.py develop
4. Create a branch for local development::
$ git checkout -b name-of-your-bugfix-or-feature
Now you can make your changes locally.
5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::
$ flake8 gnpy tests
$ python setup.py test    # or: py.test
$ tox
To get flake8 and tox, just pip install them into your virtualenv.
6. Commit your changes and push your branch to GitHub::
$ git add .
$ git commit -m "Your detailed description of your changes."
$ git push origin name-of-your-bugfix-or-feature
7. Submit a pull request through the GitHub website.
Pull Request Guidelines
-----------------------
Before you submit a pull request, check that it meets these guidelines:
1. The pull request should include tests.
2. If the pull request adds functionality, the docs should be updated. Put
your new functionality into a function with a docstring, and add the
feature to the list in README.rst.
3. The pull request should work for Python 2.6, 2.7, 3.3, 3.4 and 3.5, and for PyPy. Check
https://travis-ci.org/<TBD>/gnpy/pull_requests
and make sure that the tests pass for all supported Python versions.
Tips
----
To run a subset of tests::
$ py.test tests.test_gnpy

HISTORY.rst (deleted)

@@ -1,8 +0,0 @@
=======
History
=======
0.1.0 (2017-06-29)
------------------
* First release on PyPI.

LICENSE (deleted, 31 lines)

@@ -1,31 +0,0 @@
BSD License
Copyright (c) 2017, <TBD>
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
* Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
OF THE POSSIBILITY OF SUCH DAMAGE.

MANIFEST.in (deleted)

@@ -1,11 +0,0 @@
include AUTHORS.rst
include CONTRIBUTING.rst
include HISTORY.rst
include LICENSE
include README.rst
recursive-include tests *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]
recursive-include docs *.rst conf.py Makefile make.bat *.jpg *.png *.gif

Makefile (deleted)

@@ -1,87 +0,0 @@
.PHONY: clean clean-test clean-pyc clean-build docs help
.DEFAULT_GOAL := help
define BROWSER_PYSCRIPT
import os, webbrowser, sys
try:
from urllib import pathname2url
except:
from urllib.request import pathname2url
webbrowser.open("file://" + pathname2url(os.path.abspath(sys.argv[1])))
endef
export BROWSER_PYSCRIPT
define PRINT_HELP_PYSCRIPT
import re, sys
for line in sys.stdin:
match = re.match(r'^([a-zA-Z_-]+):.*?## (.*)$$', line)
if match:
target, help = match.groups()
print("%-20s %s" % (target, help))
endef
export PRINT_HELP_PYSCRIPT
BROWSER := python -c "$$BROWSER_PYSCRIPT"
help:
@python -c "$$PRINT_HELP_PYSCRIPT" < $(MAKEFILE_LIST)
clean: clean-build clean-pyc clean-test ## remove all build, test, coverage and Python artifacts
clean-build: ## remove build artifacts
rm -fr build/
rm -fr dist/
rm -fr .eggs/
find . -name '*.egg-info' -exec rm -fr {} +
find . -name '*.egg' -exec rm -f {} +
clean-pyc: ## remove Python file artifacts
find . -name '*.pyc' -exec rm -f {} +
find . -name '*.pyo' -exec rm -f {} +
find . -name '*~' -exec rm -f {} +
find . -name '__pycache__' -exec rm -fr {} +
clean-test: ## remove test and coverage artifacts
rm -fr .tox/
rm -f .coverage
rm -fr htmlcov/
lint: ## check style with flake8
flake8 gnpy tests
test: ## run tests quickly with the default Python
py.test
test-all: ## run tests on every Python version with tox
tox
coverage: ## check code coverage quickly with the default Python
coverage run --source gnpy -m pytest
coverage report -m
coverage html
$(BROWSER) htmlcov/index.html
docs: ## generate Sphinx HTML documentation, including API docs
rm -f docs/gnpy.rst
rm -f docs/modules.rst
sphinx-apidoc -o docs/ gnpy
$(MAKE) -C docs clean
$(MAKE) -C docs html
$(BROWSER) docs/_build/html/index.html
servedocs: docs ## compile the docs watching for changes
watchmedo shell-command -p '*.rst' -c '$(MAKE) -C docs html' -R -D .
release: clean ## package and upload a release
python setup.py sdist upload
python setup.py bdist_wheel upload
dist: clean ## builds source and wheel package
python setup.py sdist
python setup.py bdist_wheel
ls -l dist
install: clean ## install the package to the active Python's site-packages
python setup.py install

core/__init__.py (new file, 25 lines)

@@ -0,0 +1,25 @@
#!/usr/bin/env python3
from networkx import DiGraph
from logging import getLogger
logger = getLogger('gnpy.core')
from . import elements
def network_from_json(json_data):
# NOTE|dutc: we could use the following, but it would tie our data format
# too closely to the graph library
# from networkx import node_link_graph
g = DiGraph()
nodes = {}
for el in json_data['elements']:
el = getattr(elements, el['type'])(el['id'], **el['metadata'])
g.add_node(el)
nodes[el.id] = el
for cx in json_data['connections']:
from_node, to_node = nodes[cx['from_node']], nodes[cx['to_node']]
g.add_edge(from_node, to_node)
return g
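
For orientation, a minimal usage sketch for `network_from_json` (hypothetical data, not part of the commit; assumes the repository root is on `sys.path`):

```
from core import network_from_json

# Two cities joined by one fiber span; fibers are graph nodes in their own right.
json_data = {
    'elements': [
        {'id': 'A', 'type': 'City',
         'metadata': {'city': 'A', 'region': 'X', 'latitude': 0.0, 'longitude': 0.0}},
        {'id': 'fiber (A → B)', 'type': 'Fiber',
         'metadata': {'length': 80, 'units': 'km', 'latitude': 0.5, 'longitude': 0.5}},
        {'id': 'B', 'type': 'City',
         'metadata': {'city': 'B', 'region': 'X', 'latitude': 1.0, 'longitude': 1.0}},
    ],
    'connections': [
        {'from_node': 'A', 'to_node': 'fiber (A → B)'},
        {'from_node': 'fiber (A → B)', 'to_node': 'B'},
    ],
}

g = network_from_json(json_data)
assert g.number_of_nodes() == 3 and g.number_of_edges() == 2
```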

core/__main__.py (new file, 42 lines)

@@ -0,0 +1,42 @@
from argparse import ArgumentParser
from json import load
from sys import exit
from pathlib import Path
from logging import getLogger, basicConfig, INFO, ERROR, DEBUG
from matplotlib.pyplot import show, axis
from networkx import (draw_networkx_nodes, draw_networkx_edges,
draw_networkx_labels)
from . import network_from_json
from .elements import City, Fiber
logger = getLogger('gnpy.core')
def main(args):
with open(args.filename) as f:
json_data = load(f)
network = network_from_json(json_data)
pos = {n: (n.longitude, n.latitude) for n in network.nodes()}
labels_pos = {n: (long-.5, lat-.5) for n, (long, lat) in pos.items()}
size = [20 if isinstance(n, Fiber) else 80 for n in network.nodes()]
color = ['green' if isinstance(n, City) else 'red' for n in network.nodes()]
labels = {n: n.city if isinstance(n, City) else '' for n in network.nodes()}
draw_networkx_nodes(network, pos=pos, node_size=size, node_color=color)
draw_networkx_edges(network, pos=pos)
draw_networkx_labels(network, pos=labels_pos, labels=labels, font_size=14)
axis('off')
show()
parser = ArgumentParser()
parser.add_argument('filename', nargs='?', type=Path,
default=Path(__file__).parent / '../examples/coronet.conus.json')
parser.add_argument('-v', '--verbose', action='count')
if __name__ == '__main__':
args = parser.parse_args()
level = {1: INFO, 2: DEBUG}.get(args.verbose, ERROR)
logger.setLevel(level)
basicConfig()
exit(main(args))
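
Because the entry point lives in `core/__main__.py`, the package can be run directly; a hypothetical invocation from the repository root (not part of the commit) would be `python -m core examples/coronet.asia.json -v`, where `-v`/`-vv` raises the log level from the ERROR default to INFO/DEBUG per the mapping above, and omitting the filename falls back to `examples/coronet.conus.json`.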

core/elements.py (new file, 25 lines)

@@ -0,0 +1,25 @@
#!/usr/bin/env python3
from collections import namedtuple
class City(namedtuple('City', 'id city region latitude longitude')):
def __call__(self, spectral_info):
return spectral_info.copy()
class Fiber:
UNITS = {'m': 1, 'km': 1e3}
def __init__(self, id, length, units, latitude, longitude):
self.id = id
self._length = length
self._units = units
self.latitude = latitude
self.longitude = longitude
def __repr__(self):
return f'{type(self).__name__}(id={self.id}, length={self.length})'
@property
def length(self):
return self._length * self.UNITS[self._units]
def __call__(self, spectral_info):
return spectral_info.copy()
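
A quick sketch of the unit handling above (hypothetical values, not part of the commit):

```
from core.elements import Fiber

span = Fiber(id='f1', length=80, units='km', latitude=0.0, longitude=0.0)
print(span.length)  # 80000.0: the `length` property always reports metres
print(span)         # Fiber(id=f1, length=80000.0)
```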

docs/.gitignore (vendored, deleted)

@@ -1,3 +0,0 @@
/gnpy.rst
/gnpy.*.rst
/modules.rst

docs/Makefile (deleted)

@@ -1,177 +0,0 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
rm -rf $(BUILDDIR)/*
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/gnpy.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/gnpy.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/gnpy"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/gnpy"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

docs/authors.rst (deleted)

@@ -1 +0,0 @@
.. include:: ../AUTHORS.rst

docs/conf.py (deleted)

@@ -1,275 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# gnpy documentation build configuration file, created by
# sphinx-quickstart on Tue Jul 9 22:26:36 2013.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
# If extensions (or modules to document with autodoc) are in another
# directory, add these directories to sys.path here. If the directory is
# relative to the documentation root, use os.path.abspath to make it
# absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
# Get the project root dir, which is the parent dir of this
cwd = os.getcwd()
project_root = os.path.dirname(cwd)
# Insert the project root dir as the first element in the PYTHONPATH.
# This lets us ensure that the source package is imported, and that its
# version is used.
sys.path.insert(0, project_root)
import gnpy
# -- General configuration ---------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'gnpy'
copyright = u"2017, <TBD>"
# The version info for the project you're documenting, acts as replacement
# for |version| and |release|, also used in various other places throughout
# the built documents.
#
# The short X.Y version.
version = gnpy.__version__
# The full version, including alpha/beta/rc tags.
release = gnpy.__version__
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to
# some non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build']
# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
# If true, keep warnings as "system message" paragraphs in the built
# documents.
#keep_warnings = False
# -- Options for HTML output -------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'
# Theme options are theme-specific and customize the look and feel of a
# theme further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as
# html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the
# top of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon
# of the docs. This file should be a Windows icon file (.ico) being
# 16x16 or 32x32 pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets)
# here, relative to this directory. They are copied after the builtin
# static files, so a file named "default.css" will overwrite the builtin
# "default.css".
html_static_path = ['_static']
# If not '', a 'Last updated on:' timestamp is inserted at every page
# bottom, using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names
# to template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer.
# Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer.
# Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages
# will contain a <link> tag referring to it. The value of this option
# must be the base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'gnpydoc'
# -- Options for LaTeX output ------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass
# [howto/manual]).
latex_documents = [
('index', 'gnpy.tex',
u'gnpy Documentation',
u'<TBD>', 'manual'),
]
# The name of an image file (relative to this directory) to place at
# the top of the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings
# are parts, not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output ------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'gnpy',
u'gnpy Documentation',
[u'<TBD>'], 1)
]
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output ----------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'gnpy',
u'gnpy Documentation',
u'<TBD>',
'gnpy',
'One line description of project.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False

docs/contributing.rst (deleted)

@@ -1 +0,0 @@
.. include:: ../CONTRIBUTING.rst

docs/history.rst (deleted)

@@ -1 +0,0 @@
.. include:: ../HISTORY.rst

docs/index.rst (deleted)

@@ -1,22 +0,0 @@
Welcome to gnpy's documentation!
======================================
Contents:
.. toctree::
:maxdepth: 2
readme
installation
usage
modules
contributing
authors
history
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

docs/installation.rst (deleted)

@@ -1,51 +0,0 @@
.. highlight:: shell
============
Installation
============
Stable release
--------------
To install gnpy, run this command in your terminal:
.. code-block:: console
$ pip install gnpy
This is the preferred method to install gnpy, as it will always install the most recent stable release.
If you don't have `pip`_ installed, this `Python installation guide`_ can guide
you through the process.
.. _pip: https://pip.pypa.io
.. _Python installation guide: http://docs.python-guide.org/en/latest/starting/installation/
From sources
------------
The sources for gnpy can be downloaded from the `Github repo`_.
You can either clone the public repository:
.. code-block:: console
$ git clone git://github.com/<TBD>/gnpy
Or download the `tarball`_:
.. code-block:: console
$ curl -OL https://github.com/<TBD>/gnpy/tarball/master
Once you have a copy of the source, you can install it with:
.. code-block:: console
$ python setup.py install
.. _Github repo: https://github.com/<TBD>/gnpy
.. _tarball: https://github.com/<TBD>/gnpy/tarball/master

docs/make.bat (deleted)

@@ -1,242 +0,0 @@
@ECHO OFF
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set BUILDDIR=_build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. xml to make Docutils-native XML files
echo. pseudoxml to make pseudoxml-XML files for display purposes
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
%SPHINXBUILD% 2> nul
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\gnpy.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\gnpy.qhc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdf" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf
cd %BUILDDIR%/..
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdfja" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf-ja
cd %BUILDDIR%/..
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
if "%1" == "xml" (
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The XML files are in %BUILDDIR%/xml.
goto end
)
if "%1" == "pseudoxml" (
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
goto end
)
:end

docs/readme.rst (deleted)

@@ -1 +0,0 @@
.. include:: ../README.rst

docs/usage.rst (deleted)

@@ -1,7 +0,0 @@
=====
Usage
=====
To use gnpy in a project::
import gnpy

Binary file not shown.

examples/convert.py (new file, 103 lines)

@@ -0,0 +1,103 @@
#!/usr/bin/env python3
from sys import exit
try:
from xlrd import open_workbook
except ModuleNotFoundError:
exit('Required: `pip install xlrd`')
from argparse import ArgumentParser
from collections import namedtuple, Counter
from json import dumps
Node = namedtuple('Node', 'city state country region latitude longitude')
class Link(namedtuple('Link', 'from_city to_city distance distance_units')):
def __new__(cls, from_city, to_city, distance, distance_units='km'):
return super().__new__(cls, from_city, to_city, distance, distance_units)
def parse_excel(args):
with open_workbook(args.workbook) as wb:
nodes_sheet = wb.sheet_by_name('Nodes')
links_sheet = wb.sheet_by_name('Links')
# sanity check
header = [x.value.strip() for x in nodes_sheet.row(4)]
expected = ['City', 'State', 'Country', 'Region', 'Latitude', 'Longitude']
if header != expected:
raise ValueError(f'Malformed header on Nodes sheet: {header} != {expected}')
nodes = []
for row in all_rows(nodes_sheet, start=5):
nodes.append(Node(*(x.value for x in row)))
# sanity check
header = [x.value.strip() for x in links_sheet.row(4)]
expected = ['Node A', 'Node Z', 'Distance (km)']
if header != expected:
raise ValueError(f'Malformed header on Links sheet: {header} != {expected}')
links = []
for row in all_rows(links_sheet, start=5):
links.append(Link(*(x.value for x in row)))
# sanity check
all_cities = Counter(n.city for n in nodes)
if len(all_cities) != len(nodes):
    raise ValueError(f'Duplicate city: {all_cities}')
if any(ln.from_city not in all_cities or
       ln.to_city not in all_cities for ln in links):
    raise ValueError('Bad link.')
return nodes, links
parser = ArgumentParser()
parser.add_argument('workbook', nargs='?', default='CORONET_Global_Topology.xls')
parser.add_argument('-f', '--filter-region', action='append', default=[])
all_rows = lambda sh, start=0: (sh.row(x) for x in range(start, sh.nrows))
def midpoint(city_a, city_b):
lats = city_a.latitude, city_b.latitude
longs = city_a.longitude, city_b.longitude
return {
'latitude': sum(lats) / 2,
'longitude': sum(longs) / 2,
}
if __name__ == '__main__':
args = parser.parse_args()
nodes, links = parse_excel(args)
if args.filter_region:
nodes = [n for n in nodes if n.region in args.filter_region]
cities = {n.city for n in nodes}
links = [lnk for lnk in links if lnk.from_city in cities and
lnk.to_city in cities]
cities = {lnk.from_city for lnk in links} | {lnk.to_city for lnk in links}
nodes = [n for n in nodes if n.city in cities]
nodes_by_city = {n.city: n for n in nodes}
data = {
'elements':
[{'id': x.city,
'metadata': {'city': x.city, 'region': x.region,
'latitude': x.latitude,
'longitude': x.longitude},
'type': 'City'}
for x in nodes] +
[{'id': f'fiber ({x.from_city} → {x.to_city})',
'metadata': {'length': x.distance, 'units': x.distance_units,
**midpoint(nodes_by_city[x.from_city],
nodes_by_city[x.to_city])},
'type': 'Fiber'}
for x in links],
'connections':
[{'from_node': x.from_city,
'to_node': f'fiber ({x.from_city} → {x.to_city})'}
for x in links] +
[{'from_node': f'fiber ({x.from_city} → {x.to_city})',
'to_node': x.to_city}
for x in links]
}
print(dumps(data, indent=2))
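
One caveat: `midpoint` is a plain arithmetic mean, so spans crossing the ±180° antimeridian get implausible coordinates; in the generated data below, the Honolulu → Sydney fiber lands at longitude ≈ -3.3, in the Atlantic. A wrap-aware variant, offered only as a sketch (the `midpoint_wrapped` name is hypothetical), could be:

```
def midpoint_wrapped(city_a, city_b):
    """Mean of two coordinates, unwrapping longitudes that straddle
    the antimeridian (sketch, not part of the commit)."""
    latitude = (city_a.latitude + city_b.latitude) / 2
    lon_a, lon_b = city_a.longitude, city_b.longitude
    if abs(lon_a - lon_b) > 180:                 # span crosses the antimeridian
        lon_b += 360 if lon_b < lon_a else -360  # unwrap onto the same branch
    longitude = ((lon_a + lon_b) / 2 + 540) % 360 - 180  # normalize to [-180, 180)
    return {'latitude': latitude, 'longitude': longitude}
```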

examples/coronet.asia.json (new file, 378 lines)

@@ -0,0 +1,378 @@
{
"elements": [
{
"id": "Bangkok",
"metadata": {
"city": "Bangkok",
"region": "Asia",
"latitude": 13.73,
"longitude": 100.5
},
"type": "City"
},
{
"id": "Beijing",
"metadata": {
"city": "Beijing",
"region": "Asia",
"latitude": 39.92999979,
"longitude": 116.4000013
},
"type": "City"
},
{
"id": "Delhi",
"metadata": {
"city": "Delhi",
"region": "Asia",
"latitude": 28.6700003,
"longitude": 77.2099989
},
"type": "City"
},
{
"id": "Hong_Kong",
"metadata": {
"city": "Hong_Kong",
"region": "Asia",
"latitude": 22.267,
"longitude": 114.14
},
"type": "City"
},
{
"id": "Honolulu",
"metadata": {
"city": "Honolulu",
"region": "Asia",
"latitude": 21.3199996,
"longitude": -157.800003
},
"type": "City"
},
{
"id": "Mumbai",
"metadata": {
"city": "Mumbai",
"region": "Asia",
"latitude": 18.9599987,
"longitude": 72.8199999
},
"type": "City"
},
{
"id": "Seoul",
"metadata": {
"city": "Seoul",
"region": "Asia",
"latitude": 37.56000108,
"longitude": 126.9899988
},
"type": "City"
},
{
"id": "Shanghai",
"metadata": {
"city": "Shanghai",
"region": "Asia",
"latitude": 31.23,
"longitude": 121.47
},
"type": "City"
},
{
"id": "Singapore",
"metadata": {
"city": "Singapore",
"region": "Asia",
"latitude": 1.299999907,
"longitude": 103.8499992
},
"type": "City"
},
{
"id": "Sydney",
"metadata": {
"city": "Sydney",
"region": "Asia",
"latitude": -33.86999896,
"longitude": 151.2100066
},
"type": "City"
},
{
"id": "Taipei",
"metadata": {
"city": "Taipei",
"region": "Asia",
"latitude": 25.0200005,
"longitude": 121.449997
},
"type": "City"
},
{
"id": "Tokyo",
"metadata": {
"city": "Tokyo",
"region": "Asia",
"latitude": 35.6699986,
"longitude": 139.770004
},
"type": "City"
},
{
"id": "fiber (Bangkok \u2192 Delhi)",
"metadata": {
"length": 3505.949664416427,
"units": "km",
"latitude": 21.20000015,
"longitude": 88.85499945000001
},
"type": "Fiber"
},
{
"id": "fiber (Bangkok \u2192 Hong_Kong)",
"metadata": {
"length": 2070.724162058727,
"units": "km",
"latitude": 17.9985,
"longitude": 107.32
},
"type": "Fiber"
},
{
"id": "fiber (Beijing \u2192 Seoul)",
"metadata": {
"length": 1146.1242170685186,
"units": "km",
"latitude": 38.745000434999994,
"longitude": 121.69500005
},
"type": "Fiber"
},
{
"id": "fiber (Beijing \u2192 Shanghai)",
"metadata": {
"length": 1284.46539141146,
"units": "km",
"latitude": 35.579999895,
"longitude": 118.93500065
},
"type": "Fiber"
},
{
"id": "fiber (Delhi \u2192 Mumbai)",
"metadata": {
"length": 1402.1410424889511,
"units": "km",
"latitude": 23.8149995,
"longitude": 75.0149994
},
"type": "Fiber"
},
{
"id": "fiber (Hong_Kong \u2192 Shanghai)",
"metadata": {
"length": 1480.405514673738,
"units": "km",
"latitude": 26.7485,
"longitude": 117.805
},
"type": "Fiber"
},
{
"id": "fiber (Hong_Kong \u2192 Sydney)",
"metadata": {
"length": 8856.6,
"units": "km",
"latitude": -5.801499479999999,
"longitude": 132.67500330000001
},
"type": "Fiber"
},
{
"id": "fiber (Hong_Kong \u2192 Taipei)",
"metadata": {
"length": 966.1766738801513,
"units": "km",
"latitude": 23.64350025,
"longitude": 117.79499849999999
},
"type": "Fiber"
},
{
"id": "fiber (Honolulu \u2192 Sydney)",
"metadata": {
"length": 9808.61585417977,
"units": "km",
"latitude": -6.274999679999999,
"longitude": -3.294998199999995
},
"type": "Fiber"
},
{
"id": "fiber (Honolulu \u2192 Taipei)",
"metadata": {
"length": 9767.012902360886,
"units": "km",
"latitude": 23.17000005,
"longitude": -18.175003000000004
},
"type": "Fiber"
},
{
"id": "fiber (Mumbai \u2192 Singapore)",
"metadata": {
"length": 4692.7080485536935,
"units": "km",
"latitude": 10.1299993035,
"longitude": 88.33499954999999
},
"type": "Fiber"
},
{
"id": "fiber (Seoul \u2192 Tokyo)",
"metadata": {
"length": 1391.0845320188098,
"units": "km",
"latitude": 36.614999839999996,
"longitude": 133.3800014
},
"type": "Fiber"
},
{
"id": "fiber (Singapore \u2192 Sydney)",
"metadata": {
"length": 7562.33052211171,
"units": "km",
"latitude": -16.2849995265,
"longitude": 127.5300029
},
"type": "Fiber"
},
{
"id": "fiber (Taipei \u2192 Tokyo)",
"metadata": {
"length": 2537.3446021508994,
"units": "km",
"latitude": 30.344999549999997,
"longitude": 130.6100005
},
"type": "Fiber"
}
],
"connections": [
{
"from_node": "Bangkok",
"to_node": "fiber (Bangkok \u2192 Delhi)"
},
{
"from_node": "Bangkok",
"to_node": "fiber (Bangkok \u2192 Hong_Kong)"
},
{
"from_node": "Beijing",
"to_node": "fiber (Beijing \u2192 Seoul)"
},
{
"from_node": "Beijing",
"to_node": "fiber (Beijing \u2192 Shanghai)"
},
{
"from_node": "Delhi",
"to_node": "fiber (Delhi \u2192 Mumbai)"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Hong_Kong \u2192 Shanghai)"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Hong_Kong \u2192 Sydney)"
},
{
"from_node": "Hong_Kong",
"to_node": "fiber (Hong_Kong \u2192 Taipei)"
},
{
"from_node": "Honolulu",
"to_node": "fiber (Honolulu \u2192 Sydney)"
},
{
"from_node": "Honolulu",
"to_node": "fiber (Honolulu \u2192 Taipei)"
},
{
"from_node": "Mumbai",
"to_node": "fiber (Mumbai \u2192 Singapore)"
},
{
"from_node": "Seoul",
"to_node": "fiber (Seoul \u2192 Tokyo)"
},
{
"from_node": "Singapore",
"to_node": "fiber (Singapore \u2192 Sydney)"
},
{
"from_node": "Taipei",
"to_node": "fiber (Taipei \u2192 Tokyo)"
},
{
"from_node": "fiber (Bangkok \u2192 Delhi)",
"to_node": "Delhi"
},
{
"from_node": "fiber (Bangkok \u2192 Hong_Kong)",
"to_node": "Hong_Kong"
},
{
"from_node": "fiber (Beijing \u2192 Seoul)",
"to_node": "Seoul"
},
{
"from_node": "fiber (Beijing \u2192 Shanghai)",
"to_node": "Shanghai"
},
{
"from_node": "fiber (Delhi \u2192 Mumbai)",
"to_node": "Mumbai"
},
{
"from_node": "fiber (Hong_Kong \u2192 Shanghai)",
"to_node": "Shanghai"
},
{
"from_node": "fiber (Hong_Kong \u2192 Sydney)",
"to_node": "Sydney"
},
{
"from_node": "fiber (Hong_Kong \u2192 Taipei)",
"to_node": "Taipei"
},
{
"from_node": "fiber (Honolulu \u2192 Sydney)",
"to_node": "Sydney"
},
{
"from_node": "fiber (Honolulu \u2192 Taipei)",
"to_node": "Taipei"
},
{
"from_node": "fiber (Mumbai \u2192 Singapore)",
"to_node": "Singapore"
},
{
"from_node": "fiber (Seoul \u2192 Tokyo)",
"to_node": "Tokyo"
},
{
"from_node": "fiber (Singapore \u2192 Sydney)",
"to_node": "Sydney"
},
{
"from_node": "fiber (Taipei \u2192 Tokyo)",
"to_node": "Tokyo"
}
]
}
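
As the connections list shows, each fiber is a node of its own, reached from its A-end city and feeding its Z-end city through a single pair of directed edges, so every fiber node has in-degree and out-degree 1. A quick consistency check (a sketch, assuming this file is saved as `examples/coronet.asia.json` and `core` is importable):

```
from json import load
from core import network_from_json
from core.elements import Fiber

with open('examples/coronet.asia.json') as f:
    g = network_from_json(load(f))

fibers = [n for n in g if isinstance(n, Fiber)]
assert all(g.in_degree(n) == 1 and g.out_degree(n) == 1 for n in fibers)
```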

examples/coronet.conus.json (new file, 2538 lines; diff too large to display)

examples/coronet.europe.json (new file, 406 lines)

@@ -0,0 +1,406 @@
{
"elements": [
{
"id": "Amsterdam",
"metadata": {
"city": "Amsterdam",
"region": "Europe",
"latitude": 52.3699996,
"longitude": 4.88999915
},
"type": "City"
},
{
"id": "Berlin",
"metadata": {
"city": "Berlin",
"region": "Europe",
"latitude": 52.520002,
"longitude": 13.379995
},
"type": "City"
},
{
"id": "Brussels",
"metadata": {
"city": "Brussels",
"region": "Europe",
"latitude": 50.830002,
"longitude": 4.330002
},
"type": "City"
},
{
"id": "Bucharest",
"metadata": {
"city": "Bucharest",
"region": "Europe",
"latitude": 44.44,
"longitude": 26.1
},
"type": "City"
},
{
"id": "Frankfurt",
"metadata": {
"city": "Frankfurt",
"region": "Europe",
"latitude": 50.1199992,
"longitude": 8.68000104
},
"type": "City"
},
{
"id": "Istanbul",
"metadata": {
"city": "Istanbul",
"region": "Europe",
"latitude": 41.1,
"longitude": 29.0
},
"type": "City"
},
{
"id": "London",
"metadata": {
"city": "London",
"region": "Europe",
"latitude": 51.5200005,
"longitude": -0.100000296
},
"type": "City"
},
{
"id": "Madrid",
"metadata": {
"city": "Madrid",
"region": "Europe",
"latitude": 40.419998,
"longitude": -3.7100002
},
"type": "City"
},
{
"id": "Paris",
"metadata": {
"city": "Paris",
"region": "Europe",
"latitude": 48.86,
"longitude": 2.3399995
},
"type": "City"
},
{
"id": "Rome",
"metadata": {
"city": "Rome",
"region": "Europe",
"latitude": 41.8899996,
"longitude": 12.5000004
},
"type": "City"
},
{
"id": "Vienna",
"metadata": {
"city": "Vienna",
"region": "Europe",
"latitude": 48.2200024,
"longitude": 16.3700005
},
"type": "City"
},
{
"id": "Warsaw",
"metadata": {
"city": "Warsaw",
"region": "Europe",
"latitude": 52.2599987,
"longitude": 21.0200005
},
"type": "City"
},
{
"id": "Zurich",
"metadata": {
"city": "Zurich",
"region": "Europe",
"latitude": 47.3800015,
"longitude": 8.5399996
},
"type": "City"
},
{
"id": "fiber (Amsterdam \u2192 Berlin)",
"metadata": {
"length": 690.6082920593449,
"units": "km",
"latitude": 52.4450008,
"longitude": 9.134997075
},
"type": "Fiber"
},
{
"id": "fiber (Amsterdam \u2192 Brussels)",
"metadata": {
"length": 210.72879668411468,
"units": "km",
"latitude": 51.600000800000004,
"longitude": 4.610000575000001
},
"type": "Fiber"
},
{
"id": "fiber (Amsterdam \u2192 Frankfurt)",
"metadata": {
"length": 436.32424081216897,
"units": "km",
"latitude": 51.2449994,
"longitude": 6.785000095000001
},
"type": "Fiber"
},
{
"id": "fiber (Berlin \u2192 Warsaw)",
"metadata": {
"length": 623.0146935618342,
"units": "km",
"latitude": 52.390000349999994,
"longitude": 17.199997749999998
},
"type": "Fiber"
},
{
"id": "fiber (Brussels \u2192 London)",
"metadata": {
"length": 381.913012710562,
"units": "km",
"latitude": 51.17500125,
"longitude": 2.115000852
},
"type": "Fiber"
},
{
"id": "fiber (Bucharest \u2192 Istanbul)",
"metadata": {
"length": 528.5804934964391,
"units": "km",
"latitude": 42.769999999999996,
"longitude": 27.55
},
"type": "Fiber"
},
{
"id": "fiber (Bucharest \u2192 Warsaw)",
"metadata": {
"length": 1136.2004559222928,
"units": "km",
"latitude": 48.34999935,
"longitude": 23.56000025
},
"type": "Fiber"
},
{
"id": "fiber (Frankfurt \u2192 Vienna)",
"metadata": {
"length": 717.0013849336048,
"units": "km",
"latitude": 49.1700008,
"longitude": 12.52500077
},
"type": "Fiber"
},
{
"id": "fiber (Istanbul \u2192 Rome)",
"metadata": {
"length": 1650.405833597658,
"units": "km",
"latitude": 41.4949998,
"longitude": 20.7500002
},
"type": "Fiber"
},
{
"id": "fiber (London \u2192 Paris)",
"metadata": {
"length": 411.69237336349147,
"units": "km",
"latitude": 50.19000025,
"longitude": 1.1199996019999998
},
"type": "Fiber"
},
{
"id": "fiber (Madrid \u2192 Paris)",
"metadata": {
"length": 1263.6192323447242,
"units": "km",
"latitude": 44.639999,
"longitude": -0.6850003500000001
},
"type": "Fiber"
},
{
"id": "fiber (Madrid \u2192 Zurich)",
"metadata": {
"length": 1497.3583126093179,
"units": "km",
"latitude": 43.89999975,
"longitude": 2.4149997
},
"type": "Fiber"
},
{
"id": "fiber (Rome \u2192 Vienna)",
"metadata": {
"length": 920.025605583882,
"units": "km",
"latitude": 45.055001000000004,
"longitude": 14.43500045
},
"type": "Fiber"
},
{
"id": "fiber (Rome \u2192 Zurich)",
"metadata": {
"length": 823.399678170238,
"units": "km",
"latitude": 44.63500055,
"longitude": 10.52
},
"type": "Fiber"
},
{
"id": "fiber (Vienna \u2192 Warsaw)",
"metadata": {
"length": 669.2971801468174,
"units": "km",
"latitude": 50.24000055,
"longitude": 18.6950005
},
"type": "Fiber"
}
],
"connections": [
{
"from_node": "Amsterdam",
"to_node": "fiber (Amsterdam \u2192 Berlin)"
},
{
"from_node": "Amsterdam",
"to_node": "fiber (Amsterdam \u2192 Brussels)"
},
{
"from_node": "Amsterdam",
"to_node": "fiber (Amsterdam \u2192 Frankfurt)"
},
{
"from_node": "Berlin",
"to_node": "fiber (Berlin \u2192 Warsaw)"
},
{
"from_node": "Brussels",
"to_node": "fiber (Brussels \u2192 London)"
},
{
"from_node": "Bucharest",
"to_node": "fiber (Bucharest \u2192 Istanbul)"
},
{
"from_node": "Bucharest",
"to_node": "fiber (Bucharest \u2192 Warsaw)"
},
{
"from_node": "Frankfurt",
"to_node": "fiber (Frankfurt \u2192 Vienna)"
},
{
"from_node": "Istanbul",
"to_node": "fiber (Istanbul \u2192 Rome)"
},
{
"from_node": "London",
"to_node": "fiber (London \u2192 Paris)"
},
{
"from_node": "Madrid",
"to_node": "fiber (Madrid \u2192 Paris)"
},
{
"from_node": "Madrid",
"to_node": "fiber (Madrid \u2192 Zurich)"
},
{
"from_node": "Rome",
"to_node": "fiber (Rome \u2192 Vienna)"
},
{
"from_node": "Rome",
"to_node": "fiber (Rome \u2192 Zurich)"
},
{
"from_node": "Vienna",
"to_node": "fiber (Vienna \u2192 Warsaw)"
},
{
"from_node": "fiber (Amsterdam \u2192 Berlin)",
"to_node": "Berlin"
},
{
"from_node": "fiber (Amsterdam \u2192 Brussels)",
"to_node": "Brussels"
},
{
"from_node": "fiber (Amsterdam \u2192 Frankfurt)",
"to_node": "Frankfurt"
},
{
"from_node": "fiber (Berlin \u2192 Warsaw)",
"to_node": "Warsaw"
},
{
"from_node": "fiber (Brussels \u2192 London)",
"to_node": "London"
},
{
"from_node": "fiber (Bucharest \u2192 Istanbul)",
"to_node": "Istanbul"
},
{
"from_node": "fiber (Bucharest \u2192 Warsaw)",
"to_node": "Warsaw"
},
{
"from_node": "fiber (Frankfurt \u2192 Vienna)",
"to_node": "Vienna"
},
{
"from_node": "fiber (Istanbul \u2192 Rome)",
"to_node": "Rome"
},
{
"from_node": "fiber (London \u2192 Paris)",
"to_node": "Paris"
},
{
"from_node": "fiber (Madrid \u2192 Paris)",
"to_node": "Paris"
},
{
"from_node": "fiber (Madrid \u2192 Zurich)",
"to_node": "Zurich"
},
{
"from_node": "fiber (Rome \u2192 Vienna)",
"to_node": "Vienna"
},
{
"from_node": "fiber (Rome \u2192 Zurich)",
"to_node": "Zurich"
},
{
"from_node": "fiber (Vienna \u2192 Warsaw)",
"to_node": "Warsaw"
}
]
}

examples/coronet.json (new file, 3454 lines; diff too large to display)

gnpy/__init__.py (deleted)

@@ -1,7 +0,0 @@
# -*- coding: utf-8 -*-
from .gnpy import (raised_cosine_comb, analytic_formula, compute_psi, fwm_eff,
get_f_computed_interp, get_freqarray, gn_analytic, gn_model,
interpolate_in_range, GN_integral)
__all__ = ['gnpy']

(deleted example script)

@@ -1,64 +0,0 @@
import gnpy as gn
import numpy as np
import matplotlib.pyplot as plt
import time
def main():
# Accuracy parameters
flag_analytic = False
num_computed_values = 2
interp_method = 'linear'
threshold_fwm = 50
n_points = 500
n_points_min = 4
accuracy_param = {'is_analytic': flag_analytic, 'n_not_interp': num_computed_values, 'kind_interp': interp_method,
'th_fwm': threshold_fwm, 'n_points': n_points, 'n_points_min': n_points_min}
# Parallelization Parameters
n_cores = 1
# Spectrum parameters
num_ch = 95
rs = np.ones(num_ch) * 0.032
roll_off = np.ones(num_ch) * 0.05
power = np.ones(num_ch) * 0.001
if num_ch % 2 == 1: # odd number of channels
fch = np.arange(-(num_ch // 2), (num_ch // 2) + 1, 1) * 0.05 # noqa: E501
else:
fch = (np.arange(0, num_ch) - (num_ch / 2) + 0.5) * 0.05
spectrum_param = {'num_ch': num_ch, 'f_ch': fch, 'rs': rs, 'roll_off': roll_off, 'power': power}
# Fiber Parameters
beta2 = 21.27
l_span = 100
loss = 0.2
gam = 1.27
fiber_param = {'a_db': loss, 'span_length': l_span, 'beta2': beta2, 'gamma': gam}
# Compute the GN model
t = time.time()
nli_cmp, f_nli_cmp, nli_int, f_nli_int = gn.gn_model(spectrum_param, fiber_param, accuracy_param, n_cores) # noqa: E501
print('Elapsed: %s' % (time.time() - t))
# Compute the raised cosine comb
f1_array = np.linspace(np.amin(fch), np.amax(fch), 1000)  # sample count must be an int
gtx = gn.raised_cosine_comb(f1_array, rs, roll_off, fch, power)
gtx = gtx + 10 ** -6 # To avoid log10 issues.
# Plot the results
plt.figure(1)
plt.plot(f1_array, 10 * np.log10(gtx), '-b', label='WDM comb')
plt.plot(f_nli_cmp, 10 * np.log10(nli_cmp), 'ro', label='GNLI computed')
plt.plot(f_nli_int, 10 * np.log10(nli_int), 'g+', label='GNLI interpolated')
plt.ylabel('PSD [dB(W/THz)]')
plt.xlabel('f [THz]')
plt.legend(loc='upper left')
plt.grid()
plt.draw()
plt.show()
if __name__ == '__main__':
main()

gnpy/cli.py (deleted)

@@ -1,17 +0,0 @@
# -*- coding: utf-8 -*-
"""Console script for gnpy."""
import click
@click.command()
def main(args=None):
"""Console script for gnpy."""
click.echo("Replace this message by putting your code into "
"gnpy.cli.main")
click.echo("See click documentation at http://click.pocoo.org/")
if __name__ == "__main__":
main()

gnpy/gnpy.py (deleted)

@@ -1,418 +0,0 @@
# -*- coding: utf-8 -*-
"""Top-level package for gnpy."""
__author__ = """<TBD>"""
__email__ = '<TBD>@<TBD>.com'
__version__ = '0.1.0'
import numpy as np
import multiprocessing as mp
import scipy.interpolate as interp
"""
GNPy: a Python 3 implementation of the Gaussian Noise (GN) Model of nonlinear
propagation, developed by the OptCom group, Department of Electronics and
Telecommunications, Politecnico di Torino, Italy
"""
__credits__ = ["Mattia Cantono", "Vittorio Curri", "Alessio Ferrari"]
def raised_cosine_comb(f, rs, roll_off, center_freq, power):
""" Returns an array storing the PSD of a WDM comb of raised cosine shaped
channels at the input frequencies defined in array f
:param f: Array of frequencies in THz
:param rs: Array of Symbol Rates in TBaud. One Symbol rate for each channel
:param roll_off: Array of roll-off factors [0,1). One per channel
:param center_freq: Array of channels central frequencies in THz. One per channel
:param power: Array of channel powers in W. One per channel
:return: PSD of the WDM comb evaluated over f
"""
ts_arr = 1 / rs
passband_arr = (1 - roll_off) / (2 * ts_arr)
stopband_arr = (1 + roll_off) / (2 * ts_arr)
g = power / rs
psd = np.zeros(np.shape(f))
for ind in range(np.size(center_freq)):
f_nch = center_freq[ind]
g_ch = g[ind]
ts = ts_arr[ind]
passband = passband_arr[ind]
stopband = stopband_arr[ind]
ff = np.abs(f - f_nch)
tf = ff - passband
if roll_off[ind] == 0:
psd = np.where(tf <= 0, g_ch, 0.) + psd
else:
psd = g_ch * (np.where(tf <= 0, 1., 0.) + 1 / 2 * (1 + np.cos(np.pi * ts / roll_off[ind] *
tf)) * np.where(tf > 0, 1., 0.) *
np.where(np.abs(ff) <= stopband, 1., 0.)) + psd
return psd
def fwm_eff(a, Lspan, b2, ff):
""" Computes the four-wave mixing efficiency given the fiber characteristics
over a given frequency set ff
:param a: Fiber loss coefficient in 1/km
:param Lspan: Fiber length in km
:param b2: Fiber Dispersion coefficient in ps/THz/km
:param ff: Array of Frequency points in THz
:return: FWM efficiency rho
"""
rho = np.power(np.abs((1 - np.exp(-2 * a * Lspan + 1j * 4 * np.pi * np.pi * b2 * Lspan * ff)) / (
2 * a - 1j * 4 * np.pi * np.pi * b2 * ff)), 2)
return rho
def get_freqarray(f, Bopt, fmax, max_step, f_dense_low, f_dense_up, df_dense):
""" Returns a non-uniformly spaced frequency array useful for fast GN-model.
integration. The frequency array is made of a denser area, sided by two
log-spaced arrays
:param f: Central frequency at which NLI is evaluated in THz
:param Bopt: Total optical bandwidth of the system in THz
:param fmax: Upper limit of the integration domain in THz
:param max_step: Maximum step size for frequency array definition in THz
:param f_dense_low: Lower limit of denser frequency region in THz
:param f_dense_up: Upper limit of denser frequency region in THz
:param df_dense: Step size to be used in the denser frequency region in THz
:return: Non uniformly defined frequency array
"""
f_dense = np.arange(f_dense_low, f_dense_up, df_dense)
k = Bopt / 2 / (Bopt / 2 - max_step) # Compute Step ratio for log-spaced array definition
if f < 0:
Nlog_short = np.ceil(np.log(fmax / np.abs(f_dense_low)) / np.log(k) + 1)
f1_short = -(np.abs(f_dense_low) * np.power(k, np.arange(Nlog_short, 0.0, -1.0) - 1.0))
k = (Bopt / 2 + (np.abs(f_dense_up) - f_dense_low)) / (Bopt / 2 - max_step + (np.abs(f_dense_up) - f_dense_up))
Nlog_long = np.ceil(np.log((fmax + (np.abs(f_dense_up) - f_dense_up)) / abs(f_dense_up)) * 1 / np.log(k) + 1)
f1_long = np.abs(f_dense_up) * np.power(k, (np.arange(1, Nlog_long + 1) - 1)) - (
np.abs(f_dense_up) - f_dense_up)
f1_array = np.concatenate([f1_short, f_dense[1:], f1_long])
else:
Nlog_short = np.ceil(np.log(fmax / np.abs(f_dense_up)) / np.log(k) + 1)
f1_short = f_dense_up * np.power(k, np.arange(1, Nlog_short + 1) - 1)
k = (Bopt / 2 + (abs(f_dense_low) + f_dense_low)) / (Bopt / 2 - max_step + (abs(f_dense_low) + f_dense_low))
Nlog_long = np.ceil(np.log((fmax + (np.abs(f_dense_low) + f_dense_low)) / np.abs(f_dense_low)) / np.log(k) + 1)
f1_long = -(np.abs(f_dense_low) * np.power(k, np.arange(Nlog_long, 0, -1) - 1)) + (
abs(f_dense_low) + f_dense_low)
f1_array = np.concatenate([f1_long, f_dense[1:], f1_short])
return f1_array
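# Example call (added commentary; numbers are hypothetical): a grid for
# evaluating NLI at f = 0 THz in a 2 THz band, with a dense +/- 25 GHz core
# sampled at 0.1 GHz and log-spaced wings out to fmax = 1 THz.
#   f1 = get_freqarray(0., Bopt=2., fmax=1., max_step=0.05,
#                      f_dense_low=-0.025, f_dense_up=0.025, df_dense=1e-4)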
def GN_integral(b2, Lspan, a_db, gam, f_ch, rs, roll_off, power, Nch, model_param):
""" GN_integral computes the GN reference formula via smart brute force integration. The Gaussian Noise model is
applied in its incoherent form (phased-array factor =1). The function computes the integral by columns: for each f1,
a non-uniformly spaced f2 array is generated, and the integrand function is computed there. At the end of the loop
on f1, the overall GNLI is computed. Accuracy can be tuned by operating on model_param argument.
:param b2: Fiber dispersion coefficient in ps/THz/km. Scalar
:param Lspan: Fiber Span length in km. Scalar
:param a_db: Fiber loss coefficient in dB/km. Scalar
:param gam: Fiber nonlinear coefficient in 1/W/km. Scalar
:param f_ch: Baseband channels center frequencies in THz. Array of size 1xNch
:param rs: Channels' Symbol Rates in TBaud. Array of size 1xNch
:param roll_off: Channels' Roll-off factors [0,1). Array of size 1xNch
:param power: Channels' power values in W. Array of size 1xNch
:param Nch: Number of channels. Scalar
:param model_param: Dictionary with model parameters for accuracy tuning
model_param['min_FWM_inv']: Minimum FWM efficiency value to be considered for high density
integration in dB
model_param['n_grid']: Maximum Number of integration points to be used in each frequency slot of
the spectrum
model_param['n_grid_min']: Minimum Number of integration points to be used in each frequency
slot of the spectrum
model_param['f_array']: Frequencies at which GNLI is evaluated, expressed in THz
:return: GNLI: power spectral density in W/THz of the nonlinear interference at frequencies model_param['f_array']
"""
alpha_lin = a_db / 20.0 / np.log10(np.e) # Conversion in linear units 1/km
min_FWM_inv = np.power(10, model_param['min_FWM_inv'] / 10) # Conversion in linear units
n_grid = model_param['n_grid']
n_grid_min = model_param['n_grid_min']
f_array = model_param['f_array']
fmax = (f_ch[-1] - (rs[-1] / 2)) - (f_ch[0] - (rs[0] / 2)) # Get frequency limit
f2eval = np.max(np.diff(f_ch))
Bopt = f2eval * Nch # Overall optical bandwidth [THz]
min_step = f2eval / n_grid # Minimum integration step
max_step = f2eval / n_grid_min # Maximum integration step
f_dense_start = np.abs(
np.sqrt(np.power(alpha_lin, 2) / (4 * np.power(np.pi, 4) * b2 * b2) * (min_FWM_inv - 1)) / f2eval)
f_ind_eval = 0
GNLI = np.full(f_array.size, np.nan) # Pre-allocate results
for f in f_array: # Loop over f
f_dense_low = f - f_dense_start
f_dense_up = f + f_dense_start
if f_dense_low < -fmax:
f_dense_low = -fmax
if f_dense_low == 0.0:
f_dense_low = -min_step
if f_dense_up == 0.0:
f_dense_up = min_step
if f_dense_up > fmax:
f_dense_up = fmax
f_dense_width = np.abs(f_dense_up - f_dense_low)
n_grid_dense = np.ceil(f_dense_width / min_step)
df = f_dense_width / n_grid_dense
# Get non-uniformly spaced f1 array
f1_array = get_freqarray(f, Bopt, fmax, max_step, f_dense_low, f_dense_up, df)
G1 = raised_cosine_comb(f1_array, rs, roll_off, f_ch, power) # Get corresponding spectrum
Gpart = np.zeros(f1_array.size) # Pre-allocate partial result for inner integral
f_ind = 0
for f1 in f1_array: # Loop over f1
if f1 != f:
f_lim = np.sqrt(np.power(alpha_lin, 2) / (4 * np.power(np.pi, 4) * b2 * b2) * (min_FWM_inv - 1)) / (
f1 - f) + f
f2_dense_up = np.maximum(f_lim, -f_lim)
f2_dense_low = np.minimum(f_lim, -f_lim)
if f2_dense_low == 0:
f2_dense_low = -min_step
if f2_dense_up == 0:
f2_dense_up = min_step
if f2_dense_low < -fmax:
f2_dense_low = -fmax
if f2_dense_up > fmax:
f2_dense_up = fmax
else:
f2_dense_up = fmax
f2_dense_low = -fmax
f2_dense_width = np.abs(f2_dense_up - f2_dense_low)
n2_grid_dense = np.ceil(f2_dense_width / min_step)
df2 = f2_dense_width / n2_grid_dense
# Get non-uniformly spaced f2 array
f2_array = get_freqarray(f, Bopt, fmax, max_step, f2_dense_low, f2_dense_up, df2)
f2_array = f2_array[f2_array >= f1] # Do not consider points below the bisector of quadrants I and III
if f2_array.size > 0:
G2 = raised_cosine_comb(f2_array, rs, roll_off, f_ch, power) # Get spectrum there
f3_array = f1 + f2_array - f # Compute f3
G3 = raised_cosine_comb(f3_array, rs, roll_off, f_ch, power) # Get spectrum over f3
G = G2 * G3 * G1[f_ind]
if np.count_nonzero(G):
FWM_eff = fwm_eff(alpha_lin, Lspan, b2, (f1 - f) * (f2_array - f)) # Compute FWM efficiency
Gpart[f_ind] = 2 * np.trapz(FWM_eff * G, f2_array) # Compute inner integral
f_ind += 1
# Compute outer integral. Nominal span loss already compensated
GNLI[f_ind_eval] = 16 / 27 * gam * gam * np.trapz(Gpart, f1_array)
f_ind_eval += 1 # Next frequency
return GNLI  # Return the GNLI array in W/THz, evaluated at the frequencies in model_param['f_array']
def compute_psi(b2, l_eff_a, f_ch, channel_index, interfering_index, rs):
""" compute_psi computes the psi coefficient of the analytical formula.
:param b2: Fiber dispersion coefficient in ps/THz/km. Scalar
:param l_eff_a: Asymptotic effective length in km. Scalar
:param f_ch: Baseband channels center frequencies in THz. Array of size 1xNch
:param channel_index: Index of the channel. Scalar
:param interfering_index: Index of the interfering signal. Scalar
:param rs: Channels' Symbol Rates in TBaud. Array of size 1xNch
:return: psi: the coefficient
"""
b2 = np.abs(b2)
if channel_index == interfering_index:  # The channel interferes with itself (self-channel interference)
rs_sig = rs[channel_index]
psi = np.arcsinh(0.5 * np.pi ** 2 * l_eff_a * b2 * rs_sig ** 2)
else:
f_sig = f_ch[channel_index]
rs_sig = rs[channel_index]
f_int = f_ch[interfering_index]
rs_int = rs[interfering_index]
del_f = f_sig - f_int
psi = np.arcsinh(np.pi ** 2 * l_eff_a * b2 * rs_sig * (del_f + 0.5 * rs_int))
psi -= np.arcsinh(np.pi ** 2 * l_eff_a * b2 * rs_sig * (del_f - 0.5 * rs_int))
return psi
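# Added note: compute_psi evaluates the arcsinh terms of the closed-form GN
# solution implemented above, i.e.
#   self-channel:  psi = asinh(pi^2 / 2 * l_eff_a * |b2| * rs_sig^2)
#   cross-channel: psi = asinh(pi^2 * l_eff_a * |b2| * rs_sig * (del_f + rs_int / 2))
#                      - asinh(pi^2 * l_eff_a * |b2| * rs_sig * (del_f - rs_int / 2))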
def analytic_formula(ind, b2, l_eff, l_eff_a, gam, f_ch, g_ch, rs, n_ch):
""" analytic_formula computes the analytical formula.
:param ind: index of the channel at which g_nli is computed. Scalar
:param b2: Fiber dispersion coefficient in ps/THz/km. Scalar
:param l_eff: Effective length in km. Scalar
:param l_eff_a: Asymptotic effective length in km. Scalar
:param gam: Fiber nonlinear coefficient in 1/W/km. Scalar
:param f_ch: Baseband channels center frequencies in THz. Array of size 1xNch
:param g_ch: Power spectral density W/THz. Array of size 1xNch
:param rs: Channels' Symbol Rates in TBaud. Array of size 1xNch
:param n_ch: Number of channels. Scalar
:return: g_nli: power spectral density in W/THz of the nonlinear interference
"""
ch_psd = g_ch[ind]
b2 = abs(b2)
g_nli = 0
for n in np.arange(0, n_ch):
psi = compute_psi(b2, l_eff_a, f_ch, ind, n, rs)
g_nli += g_ch[n] * ch_psd ** 2 * psi
g_nli *= (16 / 27) * (gam * l_eff) ** 2 / (2 * np.pi * b2 * l_eff_a)
return g_nli
def gn_analytic(b2, l_span, a_db, gam, f_ch, rs, power, n_ch):
""" gn_analytic computes the GN reference formula via analytical solution.
:param b2: Fiber dispersion coefficient in ps/THz/km. Scalar
:param l_span: Fiber Span length in km. Scalar
:param a_db: Fiber loss coefficient in dB/km. Scalar
:param gam: Fiber nonlinear coefficient in 1/W/km. Scalar
:param f_ch: Baseband channels center frequencies in THz. Array of size 1xNch
:param rs: Channels' Symbol Rates in TBaud. Array of size 1xNch
:param power: Channels' power values in W. Array of size 1xNch
:param n_ch: Number of channels. Scalar
:return: g_nli: power spectral density in W/THz of the nonlinear interference at the channel center frequencies f_ch
"""
g_ch = power / rs
alpha_lin = a_db / 20.0 / np.log10(np.e) # Conversion in linear units 1/km
l_eff = (1 - np.exp(-2 * alpha_lin * l_span)) / (2 * alpha_lin) # Effective length
l_eff_a = 1 / (2 * alpha_lin) # Asymptotic effective length
g_nli = np.zeros(f_ch.size)
for ind in np.arange(0, f_ch.size):
g_nli[ind] = analytic_formula(ind, b2, l_eff, l_eff_a, gam, f_ch, g_ch, rs, n_ch)
return g_nli
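# Usage sketch (added commentary; values are a hypothetical span): 11 channels,
# 0 dBm each, 32 GBaud on a 50 GHz grid, over 100 km of fiber with
# a_db = 0.2 dB/km, b2 = 21.27 ps/THz/km, gam = 1.27 1/W/km.
#   f_ch = (np.arange(11) - 5) * 0.05                      # THz
#   g_nli = gn_analytic(21.27, 100., 0.2, 1.27, f_ch,
#                       np.full(11, 0.032), np.full(11, 1e-3), 11)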
def get_f_computed_interp(f_ch, n_not_interp):
""" get_f_computed_array returns the arrays containing the frequencies at which g_nli is computed and interpolated.
:param f_ch: the overall frequency array. Array of size 1xnum_ch
:param n_not_interp: the number of points at which g_nli has to be computed
:return: f_nli_comp: the array containing the frequencies at which g_nli is computed
:return: f_nli_interp: the array containing the frequencies at which g_nli is interpolated
"""
num_ch = len(f_ch)
if num_ch < n_not_interp:  # No need to compute g_nli at more points than there are channels
    n_not_interp = num_ch
# Compute f_nli_comp
n_not_interp_left = int(np.ceil((n_not_interp - 1) / 2))
n_not_interp_right = int(np.floor((n_not_interp - 1) / 2))
central_index = len(f_ch) // 2
f_nli_central = np.array([f_ch[central_index]], copy=True)
if n_not_interp_left > 0:
index = np.linspace(0, central_index - 1, n_not_interp_left, dtype='int')
f_nli_left = np.array(f_ch[index], copy=True)
else:
f_nli_left = np.array([])
if n_not_interp_right > 0:
index = np.linspace(-1, -central_index, n_not_interp_right, dtype='int')
f_nli_right = np.array(f_ch[index], copy=True)
f_nli_right = f_nli_right[::-1] # Reverse the order of the array
else:
f_nli_right = np.array([])
f_nli_comp = np.concatenate([f_nli_left, f_nli_central, f_nli_right])
# Compute f_nli_interp
f_ch_sorted = np.sort(f_ch)
index = np.searchsorted(f_ch_sorted, f_nli_comp)
f_nli_interp = np.array(f_ch, copy=True)
f_nli_interp = np.delete(f_nli_interp, index)
return f_nli_comp, f_nli_interp
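# Example (added commentary): with 9 channels and n_not_interp = 5, NLI is
# computed exactly at channels [0, 3, 4, 5, 8] (band edges, center, and two
# intermediates) and interpolated at the remaining four.
#   f_comp, f_interp = get_f_computed_interp(np.arange(9.), 5)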
def interpolate_in_range(x, y, x_new, kind_interp):
""" Given some samples y of the function y(x), interpolate_in_range returns the interpolation of values y(x_new)
:param x: The points at which y(x) is evaluated. Array
:param y: The values of y(x). Array
:param x_new: The values at which y(x) has to be interpolated. Array
:param kind_interp: The interpolation method of the function scipy.interpolate.interp1d. String
:return: y_new: the new interpolates samples
"""
if x.size == 1:
    y_new = y * np.ones(x_new.size)
else:
    if x.size == 2:
        # With only two samples, flat-extend them so interp1d has enough points
        x = np.append(x, x_new[-1])
        y = np.append(y, y[-1])
    func = interp.interp1d(x, y, kind=kind_interp, bounds_error=False)
    y_new = func(x_new)
return y_new
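# Example (added commentary; values are arbitrary): two computed samples are
# flat-extended and linearly interpolated over a denser axis.
#   y_new = interpolate_in_range(np.array([0., 1.]), np.array([2., 3.]),
#                                np.linspace(0., 2., 5), 'linear')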
def gn_model(spectrum_param, fiber_param, accuracy_param, n_cores):
""" gn_model can compute the gn model both analytically or through the smart brute force
integral.
:param spectrum_param: Dictionary with spectrum parameters
spectrum_param['num_ch']: Number of channels. Scalar
spectrum_param['f_ch']: Baseband channels center frequencies in THz. Array of size 1xnum_ch
spectrum_param['rs']: Channels' Symbol Rates in TBaud. Array of size 1xnum_ch
spectrum_param['roll_off']: Channels' Roll-off factors [0,1). Array of size 1xnum_ch
spectrum_param['power']: Channels' power values in W. Array of size 1xnum_ch
:param fiber_param: Dictionary with the parameters of the fiber
fiber_param['a_db']: Fiber loss coefficient in dB/km. Scalar
fiber_param['span_length']: Fiber Span length in km. Scalar
fiber_param['beta2']: Fiber dispersion coefficient in ps/THz/km. Scalar
fiber_param['gamma']: Fiber nonlinear coefficient in 1/W/km. Scalar
:param accuracy_param: Dictionary with model parameters for accuracy tuning
accuracy_param['is_analytic']: A boolean indicating whether to compute the NLI through
the analytic formula (is_analytic = True) or the smart brute-force integration (is_analytic =
False). Boolean
accuracy_param['n_not_interp']: The number of NLI values computed exactly. The others are
interpolated
accuracy_param['kind_interp']: The kind of interpolation used by the function
scipy.interpolate.interp1d
accuracy_param['th_fwm']: Minimum FWM efficiency value to be considered for high-density
integration in dB
accuracy_param['n_points']: Maximum number of integration points to be used in each frequency
slot of the spectrum
accuracy_param['n_points_min']: Minimum number of integration points to be used in each
frequency slot of the spectrum
:param n_cores: Number of cores for parallel computation (accepted but not used by this
implementation)
:return: g_nli_comp: the NLI power spectral density in W/THz computed through GN model
:return: f_nli_comp: the frequencies at which g_nli_comp is evaluated
:return: g_nli_interp: the NLI power spectral density in W/THz computed through interpolation of g_nli_comp
:return: f_nli_interp: the frequencies at which g_nli_interp is estimated
"""
# Take signal parameters
num_ch = spectrum_param['num_ch']
f_ch = spectrum_param['f_ch']
rs = spectrum_param['rs']
roll_off = spectrum_param['roll_off']
power = spectrum_param['power']
# Take fiber parameters
a_db = fiber_param['a_db']
l_span = fiber_param['span_length']
beta2 = fiber_param['beta2']
gam = fiber_param['gamma']
# Take accuracy parameters
is_analytic = accuracy_param['is_analytic']
n_not_interp = accuracy_param['n_not_interp']
kind_interp = accuracy_param['kind_interp']
th_fwm = accuracy_param['th_fwm']
n_points = accuracy_param['n_points']
n_points_min = accuracy_param['n_points_min']
# Computing NLI
if is_analytic: # Analytic solution
g_nli_comp = gn_analytic(beta2, l_span, a_db, gam, f_ch, rs, power, num_ch)
f_nli_comp = np.copy(f_ch)
g_nli_interp = []
f_nli_interp = []
else: # Smart brute force integration
f_nli_comp, f_nli_interp = get_f_computed_interp(f_ch, n_not_interp)
model_param = {'min_FWM_inv': th_fwm, 'n_grid': n_points, 'n_grid_min': n_points_min,
'f_array': np.array(f_nli_comp, copy=True)}
g_nli_comp = GN_integral(beta2, l_span, a_db, gam, f_ch, rs, roll_off, power, num_ch, model_param)
# Interpolation
g_nli_interp = interpolate_in_range(f_nli_comp, g_nli_comp, f_nli_interp, kind_interp)
return g_nli_comp, f_nli_comp, g_nli_interp, f_nli_interp
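if __name__ == '__main__':
    # Smoke test (added sketch; all parameters are hypothetical, not from the
    # original module): 11 channels, 32 GBaud, 0 dBm each on a 50 GHz grid,
    # one 100 km span, solved with the analytic branch of gn_model.
    nch = 11
    spectrum = {'num_ch': nch,
                'f_ch': (np.arange(nch) - nch // 2) * 0.05,  # THz
                'rs': np.full(nch, 0.032),                   # TBaud
                'roll_off': np.full(nch, 0.15),
                'power': np.full(nch, 1e-3)}                 # W
    fiber = {'a_db': 0.2, 'span_length': 100., 'beta2': 21.27, 'gamma': 1.27}
    accuracy = {'is_analytic': True, 'n_not_interp': 5, 'kind_interp': 'linear',
                'th_fwm': 50, 'n_points': 500, 'n_points_min': 4}
    g_nli, f_nli, _, _ = gn_model(spectrum, fiber, accuracy, n_cores=1)
    print('NLI PSD [W/THz]:', g_nli)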

2
requirements.txt Normal file
View File

@@ -0,0 +1,2 @@
decorator==4.1.2
networkx==2.0

View File

@@ -1,12 +0,0 @@
pip==8.1.2
bumpversion==0.5.3
wheel==0.29.0
watchdog==0.8.3
flake8==2.6.0
tox==2.3.1
coverage==4.1
Sphinx==1.4.8
cryptography==1.7
PyYAML==3.11
pytest==2.9.2
pytest-runner==2.11.1

View File

@@ -1,22 +0,0 @@
[bumpversion]
current_version = 0.1.0
commit = True
tag = True
[bumpversion:file:setup.py]
search = version='{current_version}'
replace = version='{new_version}'
[bumpversion:file:gnpy/__init__.py]
search = __version__ = '{current_version}'
replace = __version__ = '{new_version}'
[bdist_wheel]
universal = 1
[flake8]
exclude = docs
[aliases]
test = pytest
# Define setup.py command aliases here

View File

@@ -1,66 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""The setup script."""
from setuptools import setup, find_packages
with open('README.rst') as readme_file:
readme = readme_file.read()
with open('HISTORY.rst') as history_file:
history = history_file.read()
requirements = [
'Click>=6.0',
'numpy',
'scipy'
# TODO: put package requirements here
]
setup_requirements = [
'pytest-runner',
# TODO(<TBD>): put setup requirements (distutils extensions, etc.) here
]
test_requirements = [
'pytest',
# TODO: put package test requirements here
]
setup(
name='gnpy',
version='0.1.0',
description="Gaussian Noise (GN) modeling library",
long_description=readme + '\n\n' + history,
author="<TBD>",
author_email='<TBD>@<TBD>.com',
url='https://github.com/Telecominfraproject/gnpy',
packages=find_packages(include=['gnpy']),
entry_points={
'console_scripts': [
'gnpy=gnpy.cli:main'
]
},
include_package_data=True,
install_requires=requirements,
license="BSD license",
zip_safe=False,
keywords='gnpy',
classifiers=[
'Development Status :: 2 - Pre-Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
"Programming Language :: Python :: 2",
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
],
test_suite='tests',
tests_require=test_requirements,
setup_requires=setup_requirements,
)

View File

@@ -1,3 +0,0 @@
# -*- coding: utf-8 -*-
"""Unit test package for gnpy."""

View File

@@ -1,38 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Tests for `gnpy` package."""
import pytest
from click.testing import CliRunner
from gnpy import gnpy
from gnpy import cli
@pytest.fixture
def response():
"""Sample pytest fixture.
See more at: http://doc.pytest.org/en/latest/fixture.html
"""
# import requests
# return requests.get('https://github.com/audreyr/cookiecutter-pypackage')
def test_content(response):
"""Sample pytest test function with the pytest fixture as an argument."""
# from bs4 import BeautifulSoup
# assert 'GitHub' in BeautifulSoup(response.content).title.string
def test_command_line_interface():
"""Test the CLI."""
runner = CliRunner()
result = runner.invoke(cli.main)
assert result.exit_code == 0
assert 'gnpy.cli.main' in result.output
help_result = runner.invoke(cli.main, ['--help'])
assert help_result.exit_code == 0
assert '--help Show this message and exit.' in help_result.output

30
tox.ini
View File

@@ -1,30 +0,0 @@
[tox]
envlist = py26, py27, py33, py34, py35, flake8
[travis]
python =
3.5: py35
3.4: py34
3.3: py33
2.7: py27
2.6: py26
[testenv:flake8]
basepython=python
deps=flake8
commands=flake8 gnpy
[testenv]
setenv =
PYTHONPATH = {toxinidir}
deps =
-r{toxinidir}/requirements_dev.txt
commands =
pip install -U pip
py.test --basetemp={envtmpdir}
; If you want to make tox run the tests with the same versions, create a
; requirements.txt with the pinned versions and uncomment the following lines:
; deps =
; -r{toxinidir}/requirements.txt

View File

@@ -1,127 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Update encrypted deploy password in Travis config file."""
from __future__ import print_function
import base64
import json
import os
from getpass import getpass
import yaml
from cryptography.hazmat.primitives.serialization import load_pem_public_key
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric.padding import PKCS1v15
try:
from urllib import urlopen
except ImportError:
from urllib.request import urlopen
GITHUB_REPO = '<TBD>/gnpy'
TRAVIS_CONFIG_FILE = os.path.join(
os.path.dirname(os.path.abspath(__file__)), '.travis.yml')
def load_key(pubkey):
"""Load public RSA key.
Work around keys with incorrect header/footer format.
Read more about RSA encryption with cryptography:
https://cryptography.io/latest/hazmat/primitives/asymmetric/rsa/
"""
try:
return load_pem_public_key(pubkey.encode(), default_backend())
except ValueError:
# workaround for https://github.com/travis-ci/travis-api/issues/196
pubkey = pubkey.replace('BEGIN RSA', 'BEGIN').replace('END RSA', 'END')
return load_pem_public_key(pubkey.encode(), default_backend())
def encrypt(pubkey, password):
"""Encrypt password using given RSA public key and encode it with base64.
The encrypted password can only be decrypted by someone with the
private key (in this case, only Travis).
"""
key = load_key(pubkey)
encrypted_password = key.encrypt(password, PKCS1v15())
return base64.b64encode(encrypted_password)
def fetch_public_key(repo):
"""Download RSA public key Travis will use for this repo.
Travis API docs: http://docs.travis-ci.com/api/#repository-keys
"""
keyurl = 'https://api.travis-ci.org/repos/{0}/key'.format(repo)
data = json.loads(urlopen(keyurl).read().decode())
if 'key' not in data:
errmsg = "Could not find public key for repo: {}.\n".format(repo)
errmsg += "Have you already added your GitHub repo to Travis?"
raise ValueError(errmsg)
return data['key']
def prepend_line(filepath, line):
"""Rewrite a file adding a line to its beginning."""
with open(filepath) as f:
lines = f.readlines()
lines.insert(0, line)
with open(filepath, 'w') as f:
f.writelines(lines)
def load_yaml_config(filepath):
"""Load yaml config file at the given path."""
with open(filepath) as f:
return yaml.load(f)
def save_yaml_config(filepath, config):
"""Save yaml config file at the given path."""
with open(filepath, 'w') as f:
yaml.dump(config, f, default_flow_style=False)
def update_travis_deploy_password(encrypted_password):
"""Put `encrypted_password` into the deploy section of .travis.yml."""
config = load_yaml_config(TRAVIS_CONFIG_FILE)
config['deploy']['password'] = dict(secure=encrypted_password)
save_yaml_config(TRAVIS_CONFIG_FILE, config)
line = ('# This file was autogenerated and will overwrite'
' each time you run travis_pypi_setup.py\n')
prepend_line(TRAVIS_CONFIG_FILE, line)
def main(args):
"""Add a PyPI password to .travis.yml so that Travis can deploy to PyPI.
Fetch the Travis public key for the repo, and encrypt the PyPI password
with it before adding, so that only Travis can decrypt and use the PyPI
password.
"""
public_key = fetch_public_key(args.repo)
password = args.password or getpass('PyPI password: ')
update_travis_deploy_password(encrypt(public_key, password.encode()))
print("Wrote encrypted password to .travis.yml -- you're ready to deploy")
if '__main__' == __name__:
import argparse
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('--repo', default=GITHUB_REPO,
help='GitHub repo (default: %s)' % GITHUB_REPO)
parser.add_argument('--password',
help='PyPI password (will prompt if not provided)')
args = parser.parse_args()
main(args)