mirror of https://github.com/peeringdb/peeringdb.git synced 2024-05-11 05:55:09 +00:00
peeringdb-peeringdb/peeringdb_server/export_views.py
Matt Griswold ea55c4dc38 July updates (#762)
* Change label from primary ASN to ASN

* Raise validation error when trying to update ASN

* first steps for dotf importer protocol (#697)

* migrations (#697)

* Add translation to error message

* Make ASN readonly in table

* Add test now that ASN should not be able to update

* Set fac.rencode to '' for all entries and make it readonly in serializer

* Add unique constraints to network ixlan ip addresses

* Add migration to null out duplicate ipaddresses for deleted netixlans

* Add unique constraints to network ixlan ip addresses

* Add migration to null out duplicate ipaddresses for deleted netixlans

* remove old migrations (#697)

* fix netixlan ipaddr dedupe migration (#268)
add netixlan ipaddr unique constraint migration (#268)

* ixf_member_data migrations (#697)

* fix table name (#697)

* importer protocol (#697)

* fix netixlan ipaddr dedupe migration (#268)
add netixlan ipaddr unique constraint migration (#268)

* ixf proposed changes notifications (#697)

* Delete repeated query

* Add a test to show rencode is readonly

* Blank out rencode when mocking data

* Remove validator now that constraint exists

* Add back unique field validator w Check Deleted true

* conflict resolving (#697)

* UniqueFieldValidator raise error with code "unique" (#268)

* conflict resolution (#697)

* Add fixme comment to tests

* conflict resolution (#697)

* Remove now invalid undelete tests

* UniqueFieldValidator raise error with code "unique" (#268)

* delete admin tools for duplicate ip addresses

* Make migration to delete duplicateipnetworkixlan

* Add ixlan-ixpfx status matching validation, add corresponding test

* delete redundant checking in test

* resolve conflict ui (#697)

* fix migrations hierarchy

* squash migrations for ixf member data

* clean up preview and post-mortem tools

* remove nonsensical permission check when undeleting soft-deleted objects through unique integrity error handling

* only include the ix-f data url in notifications to admincom (#697)

* resolve on --skip-import (#697)

* ac conflict resolution (#697)

* Define more accurately the incompatible statuses for ixlan and ixpfx

* Add another status test

* Preventing disruptive changes (#697)

* fix tests (#697)

* Stop allow_ixp_update from being write only and add a global stat for automated networks

* Add tests for global stats that appear in footer

* Change how timezone is called with datetime, to get test_stats.py/test_generate_for_current_date to pass

* test for protected entities (#697)

* admincom conflict resolution refine readonly fields (#697)
network notifications only if the problem is actually actionable by the network (#697)

* ixp / ac notification when ix-f source cannot be parsed (#697)
fix issue with ixlan prefix protection (#697)

* migrations (#697)

* code documentation (#697)

* ux tweaks (#697)

* UX tweaks (#697)

* Fix typo

* fix netixlan returned in IXFMemberData.apply when adding a new one (#697)

* fix import log inconsistencies (#697)

* Add IXFMemberData to test

* Update test data

* Add protocol tests

* Add tests for views

* always persist changes to remote data on set_conflict (#697)

* More tests

* always persist changes to remote data on set_conflict (#697)

* suggest-add test

* net_present_at_ix should check status (#697)

* Add more protocol tests

* Edit language of some tests

* django-peeringdb to 2.1.1
relock pipfile, pin django-ratelimit to <3 as it breaks stuff

* Add net_count_ixf field to ix object (#683)

* Add the IX-F Member Export URL to the ixlan API endpoint (#249)

* Lock some objects from being deleted by the owner (#696)

* regenerate api docs (#249)

* always persist changes to remote data on set_add and set_update (#697)

* IXFMemberData: always persist remote data changes during set_add and set_update, also allow for saving without touching the updated field

* always persist changes to remote data on set_add and set_update (#697)

* Fix suggest-add tests

* IXFMemberData: always persist remote data changes during set_add and set_update, also allow for saving without touching the updated field

* IXFMemberData: always persist remote data changes during set_add and set_update, also allow for saving without touching the updated field

* fix issue with deletion when ixfmemberdata for entry existed previously (#697)

* fix test_suggest_delete_local_ixf_no_flag (#697 tests)

* fix issue with deletion when ixfmemberdata for entry existed previously (#697)

* invalid ips get logged and notified to the ix via notify_error (#697)

* Fix more tests

* issue with previous_data when running without save (#697)
properly track speed errors (#697)

* reset errors on ixfmemberdata that go into pending_save (#697)

* add remote_data to admin view (#697)

* fix error reset inconsistency (#697)

* Refine invalid data tests

* remove debug output

* for notifications to ac include contact points for net and ix in the message (#697)

* settings to toggle ix-f tickets / emails (#697)

* allow turning off ix-f notifications for net and ix separately (#697)

* add jsonschema test

* Add idempotent tests to updater

* remove old ixf member tests

* Invalid data tests when ixp_updates are enabled

* fix speed error validation (#697)

* fix issue with rollback (#697)

* fix migration hierarchy

* fix ixfmemberdata _email

* django-peeringdb to 2.2 and relock

* add ixf rollback tests

* ixf email notifications off by default

* black formatted

* pyupgrade

Co-authored-by: egfrank <egfrank@20c.com>
Co-authored-by: Stefan Pratter <stefan@20c.com>
2020-07-15 07:07:01 +00:00

401 lines · 12 KiB · Python

import json
import datetime
import urllib.request, urllib.parse, urllib.error
import csv
import io
import collections
from django.http import JsonResponse, HttpResponse
from django.views import View
from django.utils.translation import ugettext_lazy as _
from rest_framework.test import APIRequestFactory
from peeringdb_server.models import IXLan, NetworkIXLan, InternetExchange
from peeringdb_server.rest import REFTAG_MAP as RestViewSets
from peeringdb_server.renderers import JSONEncoder


def export_ixf_ix_members(ixlans, pretty=False):
    member_list = []
    ixp_list = []

    for ixlan in ixlans:
        if ixlan.ix not in ixp_list:
            ixp_list.append(ixlan.ix)

    rv = {
        "version": "0.6",
        "timestamp": datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%SZ"),
        "member_list": member_list,
        "ixp_list": [{"ixp_id": ixp.id, "shortname": ixp.name} for ixp in ixp_list],
    }

    for ixlan in ixlans:
        asns = []
        for netixlan in ixlan.netixlan_set_active.all():
            if netixlan.asn in asns:
                continue

            connection_list = []
            member = {
                "asnum": netixlan.asn,
                "member_type": "peering",
                "name": netixlan.network.name,
                "url": netixlan.network.website,
                "contact_email": [
                    poc.email
                    for poc in netixlan.network.poc_set_active.filter(visible="Public")
                ],
                "contact_phone": [
                    poc.phone
                    for poc in netixlan.network.poc_set_active.filter(visible="Public")
                ],
                "peering_policy": netixlan.network.policy_general.lower(),
                "peering_policy_url": netixlan.network.policy_url,
                "connection_list": connection_list,
            }
            member_list.append(member)
            asns.append(netixlan.asn)

            for _netixlan in ixlan.netixlan_set_active.filter(asn=netixlan.asn):
                vlan_list = [{}]
                connection = {
                    "ixp_id": _netixlan.ixlan.ix_id,
                    "state": "active",
                    "if_list": [{"if_speed": _netixlan.speed}],
                    "vlan_list": vlan_list,
                }
                connection_list.append(connection)

                if _netixlan.ipaddr4:
                    vlan_list[0]["ipv4"] = {
                        "address": f"{_netixlan.ipaddr4}",
                        "routeserver": _netixlan.is_rs_peer,
                        "max_prefix": _netixlan.network.info_prefixes4,
                        "as_macro": _netixlan.network.irr_as_set,
                    }
                if _netixlan.ipaddr6:
                    vlan_list[0]["ipv6"] = {
                        "address": f"{_netixlan.ipaddr6}",
                        "routeserver": _netixlan.is_rs_peer,
                        "max_prefix": _netixlan.network.info_prefixes6,
                        "as_macro": _netixlan.network.irr_as_set,
                    }

    if pretty:
        return json.dumps(rv, indent=2)
    else:
        return json.dumps(rv)
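
# Abridged sketch of the document built above (illustrative values only, not a
# verbatim IX-F sample; the key layout follows the construction in
# export_ixf_ix_members, with several member fields omitted for brevity):
#
#   {
#     "version": "0.6",
#     "timestamp": "2020-07-15T00:00:00Z",
#     "ixp_list": [{"ixp_id": 1, "shortname": "Example-IX"}],
#     "member_list": [
#       {
#         "asnum": 65536,
#         "member_type": "peering",
#         "connection_list": [
#           {
#             "ixp_id": 1,
#             "state": "active",
#             "if_list": [{"if_speed": 10000}],
#             "vlan_list": [
#               {"ipv4": {"address": "198.51.100.1", "routeserver": true},
#                "ipv6": {"address": "2001:db8::1", "routeserver": true}}
#             ]
#           }
#         ]
#       }
#     ]
#   }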


def view_export_ixf_ix_members(request, ix_id):
    return HttpResponse(
        export_ixf_ix_members(
            IXLan.objects.filter(ix_id=ix_id, status="ok"),
            pretty="pretty" in request.GET,
        ),
        content_type="application/json",
    )


def view_export_ixf_ixlan_members(request, ixlan_id):
    return HttpResponse(
        export_ixf_ix_members(
            IXLan.objects.filter(id=ixlan_id, status="ok"),
            pretty="pretty" in request.GET,
        ),
        content_type="application/json",
    )


class ExportView(View):
    """
    Base class for more complex data exports
    """

    # supported export formats
    formats = ["json", "json_pretty", "csv"]

    # when exporting json data, if this is not None the json data
    # will be wrapped in one additional dict and referenced at a
    # key with the specified name
    json_root_key = "data"

    # exporting data should send file attachment headers
    download = True

    # if download=True this value will be used to specify
    # the filename of the downloaded file
    download_name = "export.{extension}"

    # format to file extension translation table
    extensions = {"csv": "csv", "json": "json", "json_pretty": "json"}

    def get(self, request, fmt):
        fmt = fmt.replace("-", "_")
        if fmt not in self.formats:
            raise ValueError(_("Invalid export format"))

        try:
            response_handler = getattr(self, f"response_{fmt}")
            response = response_handler(self.generate(request))

            if self.download:
                # send attachment header, triggering download on the client side
                filename = self.download_name.format(
                    extension=self.extensions.get(fmt)
                )
                response["Content-Disposition"] = 'attachment; filename="{}"'.format(
                    filename
                )

            return response
        except Exception as exc:
            return JsonResponse({"non_field_errors": [str(exc)]}, status=400)

    def generate(self, request):
        """
        Generate the export data from the request

        Override this in subclasses
        """
        return {}

    def response_json(self, data):
        """
        Return a Response object for a plain json response

        Arguments:
            - data <list|dict>: serializable data; if a list is passed you will
              need to specify a value in self.json_root_key

        Returns:
            - JsonResponse
        """
        if self.json_root_key:
            data = {self.json_root_key: data}
        return JsonResponse(data, encoder=JSONEncoder)

    def response_json_pretty(self, data):
        """
        Return a Response object for a pretty (indented) json response

        Arguments:
            - data <list|dict>: serializable data; if a list is passed you will
              need to specify a value in self.json_root_key

        Returns:
            - HttpResponse: http response with appropriate json headers; cannot
              use JsonResponse here because we need to specify the indent level
        """
        if self.json_root_key:
            data = {self.json_root_key: data}

        return HttpResponse(
            json.dumps(data, indent=2, cls=JSONEncoder),
            content_type="application/json",
        )

    def response_csv(self, data):
        """
        Return a Response object for a CSV response

        Arguments:
            - data <list>

        Returns:
            - HttpResponse
        """
        if not data:
            return ""

        response = HttpResponse(content_type="text/csv")
        csv_writer = csv.DictWriter(response, fieldnames=list(data[0].keys()))
        csv_writer.writeheader()

        for row in data:
            for k, v in list(row.items()):
                if isinstance(v, str):
                    row[k] = f"{v}"
            csv_writer.writerow(row)

        return response
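

# Minimal sketch of how a subclass can plug into ExportView (illustrative
# example only, not part of the upstream module): override generate() to
# return a list of dicts and the base class takes care of the json,
# json_pretty and csv renderers as well as the download headers.
class ExampleWidgetExportView(ExportView):
    json_root_key = "widgets"
    download_name = "widgets.{extension}"

    def generate(self, request):
        # static data for illustration; a real subclass would build this
        # from the ORM or the api
        return [
            {"Name": "example widget", "Count": 1},
            {"Name": "another widget", "Count": 2},
        ]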


class AdvancedSearchExportView(ExportView):
    """
    Allows exporting of advanced search result data
    """

    tag = None
    json_root_key = "results"
    download_name = "advanced_search_export.{extension}"

    def fetch(self, request):
        """
        Fetch data from the api according to the GET parameters

        Note that `limit` and `depth` will be overwritten; other api
        parameters will be passed along as-is

        Returns:
            - dict: un-rendered dataset returned by the api
        """
        params = request.GET.dict()
        params["limit"] = 250
        params["depth"] = 1

        # prepare api request
        request_factory = APIRequestFactory()
        viewset = RestViewSets[self.tag].as_view({"get": "list"})
        api_request = request_factory.get(
            "/api/{}/?{}".format(self.tag, urllib.parse.urlencode(params))
        )

        # we want to use the same user as the original request
        # so permissions are applied correctly
        api_request.user = request.user

        response = viewset(api_request)
        return response.data

    def get(self, request, tag, fmt):
        """
        Handle export
        """
        self.tag = tag
        return super().get(request, fmt)

    def generate(self, request):
        """
        Generate data for the reftag specified in self.tag

        This function will call generate_<tag> and return the result

        Arguments:
            - request <Request>

        Returns:
            - list: list containing rendered data rows ready for export
        """
        if self.tag not in ["net", "ix", "fac", "org"]:
            raise ValueError(_("Invalid tag"))
        data_function = getattr(self, f"generate_{self.tag}")
        return data_function(request)

    def generate_net(self, request):
        """
        Fetch network data from the api according to request and then render
        it ready for export

        Arguments:
            - request <Request>

        Returns:
            - list: list containing rendered data ready for export
        """
        data = self.fetch(request)
        download_data = []
        for row in data:
            download_data.append(
                collections.OrderedDict(
                    [
                        ("Name", row["name"]),
                        ("Also known as", row["aka"]),
                        ("ASN", row["asn"]),
                        ("General Policy", row["policy_general"]),
                        ("Network Type", row["info_type"]),
                        ("Network Scope", row["info_scope"]),
                        ("Traffic Levels", row["info_traffic"]),
                        ("Traffic Ratio", row["info_ratio"]),
                        ("Exchanges", len(row["netixlan_set"])),
                        ("Facilities", len(row["netfac_set"])),
                    ]
                )
            )
        return download_data

    def generate_fac(self, request):
        """
        Fetch facility data from the api according to request and then render
        it ready for export

        Arguments:
            - request <Request>

        Returns:
            - list: list containing rendered data ready for export
        """
        data = self.fetch(request)
        download_data = []
        for row in data:
            download_data.append(
                collections.OrderedDict(
                    [
                        ("Name", row["name"]),
                        ("Management", row["org_name"]),
                        ("CLLI", row["clli"]),
                        ("NPA-NXX", row["npanxx"]),
                        ("City", row["city"]),
                        ("Country", row["country"]),
                        ("State", row["state"]),
                        ("Postal Code", row["zipcode"]),
                        ("Networks", row["net_count"]),
                    ]
                )
            )
        return download_data

    def generate_ix(self, request):
        """
        Fetch exchange data from the api according to request and then render
        it ready for export

        Arguments:
            - request <Request>

        Returns:
            - list: list containing rendered data ready for export
        """
        data = self.fetch(request)
        download_data = []
        for row in data:
            download_data.append(
                collections.OrderedDict(
                    [
                        ("Name", row["name"]),
                        ("Media Type", row["media"]),
                        ("Country", row["country"]),
                        ("City", row["city"]),
                        ("Networks", row["net_count"]),
                    ]
                )
            )
        return download_data

    def generate_org(self, request):
        """
        Fetch organization data from the api according to request and then render
        it ready for export

        Arguments:
            - request <Request>

        Returns:
            - list: list containing rendered data ready for export
        """
        data = self.fetch(request)
        download_data = []
        for row in data:
            download_data.append(
                collections.OrderedDict(
                    [
                        ("Name", row["name"]),
                        ("Country", row["country"]),
                        ("City", row["city"]),
                    ]
                )
            )
        return download_data
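

def _example_advanced_search_csv_export(request):
    """
    Illustrative helper only (not part of the upstream module): shows how the
    advanced search export is driven end to end. The view is called with a
    reftag ("net", "ix", "fac" or "org") and a format ("json", "json-pretty"
    or "csv"; hyphens are normalized to underscores in ExportView.get()),
    re-using the caller's request so api permissions apply.
    """
    view = AdvancedSearchExportView.as_view()
    # renders the caller's advanced search as a downloadable
    # advanced_search_export.csv attachment
    return view(request, tag="net", fmt="csv")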