Pymygdala

pymygdala API

Pymygdala: a CAPO aware Python library built atop pika for interacting with RabbitMQ, plus assorted simple command line applications built on the library.

Using this library you can log messages to RabbitMQ with a standard-ish logging handler and the Python logging package, monitor a RabbitMQ server and dump out the logs, and send various events to the server. Many (most?) of the defaults for things like routing keys, exchange names and CAPO properties are heavily NRAO-centric, but you can override them if you want.

class pymygdala.LogDumper(**kwargs)[source]

Dump the logs from a RabbitMQ server.

Keyword Arguments:
 

profile (“string”): the name of the profile to use, e.g. ‘test’, ‘production’. If missing, this defaults to the CAPO_PROFILE environment variable; if that is missing as well, a ValueError is raised. A profile should be a simple word, one without spaces, tabs or other evil things in it. This is not checked and no guarantee is provided.

hostname (“string”): name of the AMQP server to use, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.hostname”.

username (“string”): username to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.username”.

password (“string”): password to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.password”.

port (“int”): port number to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.port”.

connection_attempts (int): maximum number of retry attempts.

retry_delay (int|float): time to wait, in seconds, before the next connection attempt.

socket_timeout (int|float): socket timeout, in seconds; useful for high-latency networks.

exchange (“string”): the exchange to use, defaults to pymygdala.LOG_EXCHANGE.

exchange_type (“string”): type of exchange to use, defaults to pymygdala.LOG_EXCHANGE_TYPE.

outfile (“string”): the name of the file to write to, defaults to ‘-’ for sys.stdout.

Example:
>>> # Dump the 'test' logs to stdout
>>> from pymygdala import LogDumper
>>> dumper = LogDumper(profile='test')
>>> dumper.dump()
dump()[source]

Connect to the RabbitMQ server and dump the logs to the configured output file.

class pymygdala.LogHandler(**kwargs)[source]

Logging handler for a RabbitMQ server.

Keyword Arguments:
 

profile (“string”): the name of the profile to use, e.g. ‘test’, ‘production’. If missing, this defaults to the CAPO_PROFILE environment variable; if that is missing as well, a ValueError is raised. A profile should be a simple word, one without spaces, tabs or other evil things in it. This is not checked and no guarantee is provided.

hostname (“string”): name of the AMQP server to use, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.hostname”.

username (“string”): username to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.username”.

password (“string”): password to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.password”.

port (“int”): port number to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.port”.

connection_attempts (int): maximum number of retry attempts.

retry_delay (int|float): time to wait, in seconds, before the next connection attempt.

socket_timeout (int|float): socket timeout, in seconds; useful for high-latency networks.

exchange (“string”): the exchange to use, defaults to pymygdala.LOG_EXCHANGE.

exchange_type (“string”): type of exchange to use, defaults to pymygdala.LOG_EXCHANGE_TYPE.

application (“string”): the name of the application sending the event or message, “unknown” by default.

level (“level”): the logging level to use.

Example:
>>> import logging
>>> from pymygdala import LogHandler
>>> log = logging.getLogger(__name__)
>>> handler = LogHandler(profile='test', application='test-app', level=logging.DEBUG)
>>> log.addHandler(handler)
>>> log.error("my hovercraft is full of eels")
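LogHandler plugs into the standard logging machinery as an ordinary logging.Handler subclass. A stripped-down sketch of that pattern, collecting JSON payloads in memory instead of publishing to RabbitMQ (everything below is illustrative, not pymygdala's actual implementation):

```python
import json
import logging

class CollectingHandler(logging.Handler):
    """Toy stand-in for LogHandler: collects JSON payloads instead of publishing."""

    def __init__(self, level=logging.NOTSET):
        super().__init__(level)
        self.payloads = []

    def emit(self, record):
        # A real handler would publish this to an exchange; we just keep it.
        self.payloads.append(json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
        }))

log = logging.getLogger("collecting-demo")
log.setLevel(logging.DEBUG)
handler = CollectingHandler()
log.addHandler(handler)
log.error("my hovercraft is full of eels")
```

Swapping CollectingHandler for LogHandler in the example above is the only change needed to route the same log call to RabbitMQ.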
acquire()

Acquire the I/O thread lock.

addFilter(filter)

Add the specified filter to this handler.

close()

Tidy up any resources used by the handler.

This version removes the handler from an internal map of handlers, _handlers, which is used for handler lookup by name. Subclasses should ensure that this gets called from overridden close() methods.

createLock()

Acquire a thread lock for serializing access to the underlying I/O.

emit(record)[source]

Emit a logging record. The routing key is rebuilt on each emission because it includes the logging level.

filter(record)

Determine if a record is loggable by consulting all the filters.

The default is to allow the record to be logged; any filter can veto this and the record is then dropped. Returns a zero value if a record is to be dropped, else non-zero.

Changed in version 3.2: Allow filters to be just callables.

flush()

Ensure all logging output has been flushed.

This version does nothing and is intended to be implemented by subclasses.

format(record)

Format the specified record.

If a formatter is set, use it. Otherwise, use the default formatter for the module.

get_name()
handle(record)

Conditionally emit the specified logging record.

Emission depends on filters which may have been added to the handler. Wrap the actual emission of the record with acquisition/release of the I/O thread lock. Returns whether the filter passed the record for emission.

handleError(record)

Handle errors which occur during an emit() call.

This method should be called from handlers when an exception is encountered during an emit() call. If raiseExceptions is false, exceptions get silently ignored. This is what is mostly wanted for a logging system - most users will not care about errors in the logging system, they are more interested in application errors. You could, however, replace this with a custom handler if you wish. The record which was being processed is passed in to this method.

name
record_to_dict(record)[source]

Turn a Python logging.LogRecord into a simple dictionary, with the structure we expect, that can easily be turned into a JSON string and sent off to RabbitMQ.
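The exact structure is defined by pymygdala; a minimal sketch of the idea, turning a logging.LogRecord into a JSON-ready dict (the field names below are assumptions for illustration only):

```python
import json
import logging

def record_to_dict_sketch(record):
    # Field names here are illustrative; pymygdala defines its own structure.
    return {
        "loggerName": record.name,
        "level": record.levelname,
        "message": record.getMessage(),  # applies % formatting args, if any
    }

record = logging.LogRecord(
    name="demo", level=logging.ERROR, pathname="demo.py", lineno=1,
    msg="my hovercraft is full of %s", args=("eels",), exc_info=None,
)
payload = json.dumps(record_to_dict_sketch(record))
```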

release()

Release the I/O thread lock.

removeFilter(filter)

Remove the specified filter from this handler.

setFormatter(fmt)

Set the formatter for this handler.

setLevel(level)

Set the logging level of this handler. level must be an int or a str.

set_name(name)
class pymygdala.SendEvent(**kwargs)[source]

Send events to a RabbitMQ server.
Keyword Arguments:
 

profile (“string”): the name of the profile to use, e.g. ‘test’, ‘production’. If missing, this defaults to the CAPO_PROFILE environment variable; if that is missing as well, a ValueError is raised. A profile should be a simple word, one without spaces, tabs or other evil things in it. This is not checked and no guarantee is provided.

application (“string”): the name of the application sending the event or message, “unknown” by default.

hostname (“string”): name of the AMQP server to use, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.hostname”.

username (“string”): username to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.username”.

password (“string”): password to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.password”.

port (“int”): port number to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.port”.

connection_attempts (int): maximum number of retry attempts.

retry_delay (int|float): time to wait, in seconds, before the next connection attempt.

socket_timeout (int|float): socket timeout, in seconds; useful for high-latency networks.

exchange (“string”): the exchange to use, defaults to pymygdala.EVENT_EXCHANGE.

exchange_type (“string”): type of exchange to use, defaults to pymygdala.EVENT_EXCHANGE_TYPE.

routing_key (“string”): how messages will be routed to queues, format defaults to pymygdala.EVENT_ROUTING_KEY_FORMAT.

Example:
>>> # Send the kind of event the solr indexer does when it starts to index
>>> from pymygdala import SendEvent
>>> se = SendEvent(profile='test', application='solr-indexer')
>>> event = {
...     "logData": {
...         "dryRun": False,
...         "numberOfDocuments": 9999
...     },
...     "message": "Starting reindexing process",
...     "request": "some request"
... }
>>> se.send(event)
send(event, routing_key=None, headers=None)[source]

Send an event to RabbitMQ: an ‘event’ here is a simple dictionary that can be converted into a JSON string.

Required Arguments:
 event (“dict”): a dictionary representing the event to send.
Optional Arguments:
 routing_key (“string”): the routing key of the message, defaults to application.event.
 headers (“dict”): a dictionary of headers to send with the event as ‘properties’.
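Per the docs above, the default routing key is derived from the application name as application.event, and an event is just a dict serialized to JSON. A small sketch of that convention (the helper below is hypothetical, not part of pymygdala):

```python
import json

def default_routing_key(application):
    # Hypothetical helper mirroring the documented default, "<application>.event".
    return "{}.event".format(application)

event = {"message": "Starting reindexing process"}
body = json.dumps(event)  # events are plain dicts serialized to JSON
key = default_routing_key("solr-indexer")
```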
class pymygdala.SendNRAOEvent(**kwargs)[source]

This subclasses SendEvent and just adds some more fields to the event, based on the exchange, so you can use it to trigger AAT/PPI workflows more easily.

Keyword Arguments:
 

profile (“string”): the name of the profile to use, e.g. ‘test’, ‘production’. If missing, this defaults to the CAPO_PROFILE environment variable; if that is missing as well, a ValueError is raised. A profile should be a simple word, one without spaces, tabs or other evil things in it. This is not checked and no guarantee is provided.

application (“string”): the name of the application sending the event or message, “unknown” by default.

hostname (“string”): name of the AMQP server to use, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.hostname”.

username (“string”): username to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.username”.

password (“string”): password to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.password”.

port (“int”): port number to connect to AMQP with, defaults to CAPO property “edu.nrao.archive.configuration.AmqpServer.port”.

connection_attempts (int): maximum number of retry attempts.

retry_delay (int|float): time to wait, in seconds, before the next connection attempt.

socket_timeout (int|float): socket timeout, in seconds; useful for high-latency networks.

exchange (“string”): the exchange to use, defaults to pymygdala.EVENT_EXCHANGE.

exchange_type (“string”): type of exchange to use, defaults to pymygdala.EVENT_EXCHANGE_TYPE.

routing_key (“string”): how messages will be routed to queues, format defaults to pymygdala.EVENT_ROUTING_KEY_FORMAT.

Example:
>>> # Send the kind of event the solr indexer does when it starts to index
>>> from pymygdala import SendNRAOEvent
>>> se = SendNRAOEvent(profile='test', application='solr-indexer')
>>> headers = {
...     "reply-to": "jim@bob.foo"
... }
>>> event = {
...     "logData": {
...         "dryRun": False,
...         "numberOfDocuments": 9999
...     },
...     "message": "Starting reindexing process",
...     "request": "some request"
... }
>>> se.send(event, headers=headers)
send(event, routing_key=None, headers=None)

Send an event to RabbitMQ: an ‘event’ here is a simple dictionary that can be converted into a JSON string.

Required Arguments:
 event (“dict”): a dictionary representing the event to send.
Optional Arguments:
 routing_key (“string”): the routing key of the message, defaults to application.event.
 headers (“dict”): a dictionary of headers to send with the event as ‘properties’.

pymygdala CLI utilities

pym-dumplogs

pym-dumplogs, 0.2.0: connect to a RabbitMQ server and dump the logs out.

usage: pym-dumplogs [-h] [--profile PROFILE] [--exchange EXCHANGE]
                    [--type EXCHANGE_TYPE] [--hostname HOSTNAME]
                    [--username USERNAME] [--password PASSWORD] [--port PORT]
                    [--key ROUTING_KEY] [--outfile OUTFILE]

Named Arguments

--profile CAPO profile name to use, e.g. test, production
--exchange exchange name to use, e.g. archive.logs
--type exchange type to use, e.g. fanout
--hostname server name of the RabbitMQ server
--username username of the RabbitMQ server
--password password of the RabbitMQ server
--port port number of the RabbitMQ server
--key routing key, e.g. archive.logs
--outfile write output to file, - for STDOUT

Return values:
0: everything worked as expected
1: missing CAPO profile, no --profile argument or environment variable
2: can’t connect to specified RabbitMQ server

pym-sendlog

pym-sendlog, 0.2.0: a command line tool for logging to RabbitMQ.

usage: pym-sendlog [-h] [--profile PROFILE] [--exchange EXCHANGE]
                   [--type EXCHANGE_TYPE] [--hostname HOSTNAME]
                   [--username USERNAME] [--password PASSWORD] [--port PORT]
                   [--key ROUTING_KEY] --level LEVEL --message MESSAGE --app
                   APPLICATION

Named Arguments

--profile CAPO profile name to use, e.g. test, production
--exchange exchange name to use, e.g. archive.logs
--type exchange type to use, e.g. fanout
--hostname server name of the RabbitMQ server
--username username of the RabbitMQ server
--password password of the RabbitMQ server
--port port number of the RabbitMQ server
--key routing key, e.g. archive.logs

Required Arguments

--level logging level, e.g. DEBUG, WARN
--message message to log
--app the application name to log as

Return values:
0: everything worked as expected
1: missing CAPO profile, no --profile argument or environment variable
2: can’t connect to specified RabbitMQ server
3: unknown logging level, should be DEBUG, INFO, WARN or ERROR

pym-sendevent

pym-sendevent, 0.2.0: send an event to a RabbitMQ server.

usage: pym-sendevent [-h] [--profile PROFILE] [--exchange EXCHANGE]
                     [--type EXCHANGE_TYPE] [--hostname HOSTNAME]
                     [--username USERNAME] [--password PASSWORD] [--port PORT]
                     [--key ROUTING_KEY] [--event EVENTNAME] [--nrao]
                     [--message MESSAGE] [--md-name MD_NAME]
                     [--metadata METADATA] --app APPLICATION

Named Arguments

--profile CAPO profile name to use, e.g. test, production
--exchange exchange name to use, e.g. archive.logs
--type exchange type to use, e.g. fanout
--hostname server name of the RabbitMQ server
--username username of the RabbitMQ server
--password password of the RabbitMQ server
--port port number of the RabbitMQ server
--key routing key, e.g. archive.logs
--event eventName to send, e.g. runSdmIngestionWorkflow
--nrao add NRAO specific fields to the event
--message provide a KEY=VALUE pair in the event message, can be used multiple times
--md-name name of an extra metadata message section
--metadata provide a KEY=VALUE pair in the md-name section of the message; can be used multiple times

Required Arguments

--app the application name to log as

Return values:
0: everything worked as expected
1: missing CAPO profile, no --profile argument or environment variable
2: can’t connect to specified RabbitMQ server
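The --message and --metadata options take repeatable KEY=VALUE pairs. A sketch of how such pairs could be folded into an event dictionary (this parsing logic is an assumption for illustration, not pym-sendevent's actual implementation):

```python
def pairs_to_dict(pairs):
    # Split each "KEY=VALUE" on the first '=' only, so values may contain '='.
    out = {}
    for pair in pairs:
        key, _, value = pair.partition("=")
        out[key] = value
    return out

# e.g. pym-sendevent ... --message dryRun=False --message numberOfDocuments=9999
log_data = pairs_to_dict(["dryRun=False", "numberOfDocuments=9999"])
```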
