Dataset schema:
partition: stringclasses, 3 values
func_name: stringlengths 1-134
docstring: stringlengths 1-46.9k
path: stringlengths 4-223
original_string: stringlengths 75-104k
code: stringlengths 75-104k
docstring_tokens: listlengths 1-1.97k
repo: stringlengths 7-55
language: stringclasses, 1 value
url: stringlengths 87-315
code_tokens: listlengths 19-28.4k
sha: stringlengths 40-40
partition: valid
func_name: Api.pull_stream
path: pyems/__init__.py
def pull_stream(self, uri, **kwargs):
    """
    This will try to pull in a stream from an external source. Once a
    stream has been successfully pulled it is assigned a 'local stream
    name' which can be used to access the stream from the EMS.

    :param uri: The URI of the external stream. Can be RTMP, RTSP or
        unicast/multicast (d) mpegts
    :type uri: str
    :param keepAlive: If keepAlive is set to 1, the server will attempt
        to reestablish connection with a stream source after a
        connection has been lost. The reconnect will be attempted once
        every second (default: 1 true)
    :type keepAlive: int
    :param localStreamName: If provided, the stream will be given this
        name. Otherwise, a fallback technique is used to determine the
        stream name (based on the URI)
    :type localStreamName: str
    :param forceTcp: If 1 and if the stream is RTSP, a TCP connection
        will be forced. Otherwise the transport mechanism will be
        negotiated (UDP or TCP) (default: 1 true)
    :type forceTcp: int
    :param tcUrl: When specified, this value will be used to set the
        TC URL in the initial RTMP connect invoke
    :type tcUrl: str
    :param pageUrl: When specified, this value will be used to set the
        originating web page address in the initial RTMP connect invoke
    :type pageUrl: str
    :param swfUrl: When specified, this value will be used to set the
        originating swf URL in the initial RTMP connect invoke
    :type swfUrl: str
    :param rangeStart: For RTSP and RTMP connections. A value from
        which the playback should start, expressed in seconds. There
        are 2 special values: -2 and -1. For more information, please
        read about the start/len parameters here:
        http://livedocs.adobe.com/flashmediaserver/3.0/hpdocs/help.html?content=00000185.html
    :type rangeStart: int
    :param rangeEnd: The length in seconds for the playback. -1 is a
        special value. For more information, please read about the
        start/len parameters here:
        http://livedocs.adobe.com/flashmediaserver/3.0/hpdocs/help.html?content=00000185.html
    :type rangeEnd: int
    :param ttl: Sets the IP_TTL (time to live) option on the socket
    :type ttl: int
    :param tos: Sets the IP_TOS (Type of Service) option on the socket
    :type tos: int
    :param rtcpDetectionInterval: How much time (in seconds) the server
        should wait for RTCP packets before declaring the RTSP stream
        an RTCP-less stream
    :type rtcpDetectionInterval: int
    :param emulateUserAgent: When specified, this value will be used as
        the user agent string. It is meaningful only for RTMP
    :type emulateUserAgent: str
    :param isAudio: If 1 and if the stream is RTP, it indicates that
        the currently pulled stream is an audio source. Otherwise the
        pulled source is assumed to be a video source
    :type isAudio: int
    :param audioCodecBytes: The audio codec setup of this RTP stream if
        it is audio. Represented in hex format without '0x' or 'h'.
        For example: audioCodecBytes=1190
    :type audioCodecBytes: str
    :param spsBytes: The video SPS bytes of this RTP stream if it is
        video. It should be base 64 encoded
    :type spsBytes: str
    :param ppsBytes: The video PPS bytes of this RTP stream if it is
        video. It should be base 64 encoded
    :type ppsBytes: str
    :param ssmIp: The source IP from source-specific-multicast. Only
        usable when doing UDP based pull
    :type ssmIp: str
    :param httpProxy: This parameter has two valid values:
        IP:Port - specifies an RTSP HTTP proxy from which the RTSP
        stream should be pulled;
        self - specifying "self" as the value implies pulling RTSP
        over HTTP
    :type httpProxy: str
    :link: http://docs.evostream.com/ems_api_definition/pullstream
    """
    return self.protocol.execute('pullStream', uri=uri, **kwargs)
repo: tomi77/pyems
language: python
url: https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L23-L119
sha: 8c0748b720d389f19d5226fdcceedc26cd6284ee
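The wrapper above simply forwards its keyword arguments to the protocol layer's `execute` call, so every EMS `pullStream` parameter maps one-to-one onto a Python keyword argument. A minimal sketch of the call shape, using a hypothetical stub in place of pyems' real protocol object (which talks to a live EMS instance):

```python
class StubProtocol:
    """Hypothetical stand-in for pyems' protocol layer; it records the
    command and parameters instead of talking to a real EMS server."""
    def execute(self, command, **params):
        return {"command": command, "params": params}

class Api:
    """Mirrors the pull_stream wrapper shown above."""
    def __init__(self, protocol):
        self.protocol = protocol

    def pull_stream(self, uri, **kwargs):
        return self.protocol.execute('pullStream', uri=uri, **kwargs)

api = Api(StubProtocol())
# EMS pullStream parameters become keyword arguments unchanged.
result = api.pull_stream('rtsp://example.org/live',
                         localStreamName='cam1', forceTcp=1, keepAlive=1)
print(result['command'])        # pullStream
print(result['params']['uri'])  # rtsp://example.org/live
```

The stream names (`cam1`) and URIs here are invented for illustration; against a real server the return value would be the EMS JSON response rather than this stub dictionary.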
partition: valid
func_name: Api.push_stream
path: pyems/__init__.py
def push_stream(self, uri, **kwargs):
    """
    Try to push a local stream to an external destination. The pushed
    stream can only use the RTMP, RTSP or MPEG-TS unicast/multicast
    protocol.

    :param uri: The URI of the external stream. Can be RTMP, RTSP or
        unicast/multicast (d) mpegts
    :type uri: str
    :param keepAlive: If keepAlive is set to 1, the server will attempt
        to reestablish connection with a stream source after a
        connection has been lost. The reconnect will be attempted once
        every second (default: 1 true)
    :type keepAlive: int
    :param localStreamName: If provided, the stream will be given this
        name. Otherwise, a fallback technique is used to determine the
        stream name (based on the URI)
    :type localStreamName: str
    :param targetStreamName: The name of the stream at the destination.
        If not provided, the target stream name will be the same as the
        local stream name
    :type targetStreamName: str
    :param targetStreamType: It can be one of the following: **live**,
        **record**, **append**. It is meaningful only for RTMP
    :type targetStreamType: str
    :param tcUrl: When specified, this value will be used to set the
        TC URL in the initial RTMP connect invoke
    :type tcUrl: str
    :param pageUrl: When specified, this value will be used to set the
        originating web page address in the initial RTMP connect invoke
    :type pageUrl: str
    :param swfUrl: When specified, this value will be used to set the
        originating swf URL in the initial RTMP connect invoke
    :type swfUrl: str
    :param ttl: Sets the IP_TTL (time to live) option on the socket
    :type ttl: int
    :param tos: Sets the IP_TOS (Type of Service) option on the socket
    :type tos: int
    :param emulateUserAgent: When specified, this value will be used as
        the user agent string. It is meaningful only for RTMP
    :type emulateUserAgent: str
    :param rtmpAbsoluteTimestamps: Forces the timestamps to be absolute
        when using RTMP
    :type rtmpAbsoluteTimestamps: int
    :param sendChunkSizeRequest: Sets whether the RTMP stream will or
        will not send a "Set Chunk Length" message. This is significant
        when pushing to Akamai's new RTMP HD ingest point, where this
        parameter should be set to 0 so that Akamai will not drop the
        connection
    :type sendChunkSizeRequest: int
    :param useSourcePts: When the value is true, timestamps on the
        source inbound RTMP stream are passed directly to the outbound
        (pushed) RTMP streams. This affects only pushed outbound net
        RTMP with a net RTMP source. This parameter overrides the value
        of the config.lua option of the same name
    :type useSourcePts: int
    :link: http://docs.evostream.com/ems_api_definition/pushstream
    """
    return self.protocol.execute('pushStream', uri=uri, **kwargs)
repo: tomi77/pyems
language: python
url: https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L125-L196
sha: 8c0748b720d389f19d5226fdcceedc26cd6284ee
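As with pull_stream, push_stream is a thin pass-through to the protocol layer. A sketch of a typical push, renaming the stream at the destination; the stub below is a hypothetical stand-in for pyems' real transport, and the stream names and ingest URI are invented:

```python
class StubProtocol:
    """Hypothetical stand-in for pyems' protocol layer."""
    def execute(self, command, **params):
        return {"command": command, "params": params}

class Api:
    """Mirrors the push_stream wrapper shown above."""
    def __init__(self, protocol):
        self.protocol = protocol

    def push_stream(self, uri, **kwargs):
        return self.protocol.execute('pushStream', uri=uri, **kwargs)

api = Api(StubProtocol())
# Push local stream 'cam1' to an RTMP destination under a new name.
result = api.push_stream('rtmp://ingest.example.org/live',
                         localStreamName='cam1',
                         targetStreamName='cam1-out',
                         targetStreamType='live')
print(result['command'])  # pushStream
```

Note the command name sent over the wire is camelCase ('pushStream') even though the Python wrapper is snake_case.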
partition: valid
func_name: Api.create_hls_stream
path: pyems/__init__.py
def create_hls_stream(self, localStreamNames, targetFolder, **kwargs):
    """
    Create an HTTP Live Stream (HLS) out of an existing H.264/AAC
    stream. HLS is used to stream live feeds to iOS devices such as
    iPhones and iPads.

    :param localStreamNames: The stream(s) that will be used as the
        input. This is a comma-delimited list of active stream names
        (local stream names)
    :type localStreamNames: str
    :param targetFolder: The folder where all the .ts/.m3u8 files will
        be stored. This folder must be accessible by the HLS clients.
        It is usually in the web-root of the server
    :type targetFolder: str
    :param keepAlive: If true, the EMS will attempt to reconnect to the
        stream source if the connection is severed
    :type keepAlive: int
    :param overwriteDestination: If true, it will force overwrite of
        destination files
    :type overwriteDestination: int
    :param staleRetentionCount: The number of old files kept besides
        the ones listed in the current version of the playlist. Only
        applicable for rolling playlists
    :type staleRetentionCount: int
    :param createMasterPlaylist: If true, a master playlist will be
        created
    :type createMasterPlaylist: int
    :param cleanupDestination: If true, all *.ts and *.m3u8 files in
        the target folder will be removed before HLS creation is
        started
    :type cleanupDestination: int
    :param bandwidths: The corresponding bandwidths for each stream
        listed in localStreamNames. Again, this can be a
        comma-delimited list
    :type bandwidths: int
    :param groupName: The name assigned to the HLS stream or group. If
        the localStreamNames parameter contains only one entry and
        groupName is not specified, groupName will have the value of
        the input stream name
    :type groupName: str
    :param playlistType: Either appending or rolling
    :type playlistType: str
    :param playlistLength: The length (number of elements) of the
        playlist. Used only when playlistType is rolling. Ignored
        otherwise
    :type playlistLength: int
    :param playlistName: The file name of the playlist (*.m3u8)
    :type playlistName: str
    :param chunkLength: The length (in seconds) of each playlist
        element (*.ts file). Minimum value is 1 (second)
    :type chunkLength: int
    :param maxChunkLength: Maximum length (in seconds) the EMS will
        allow any single chunk to be. This matters primarily in the
        case of chunkOnIDR=true, where the EMS will wait for the next
        key-frame. If maxChunkLength is less than chunkLength, the
        parameter shall be ignored
    :type maxChunkLength: int
    :param chunkBaseName: The base name used to generate the *.ts
        chunks
    :type chunkBaseName: str
    :param chunkOnIDR: If true, chunking is performed ONLY on IDR.
        Otherwise, chunking is performed whenever chunk length is
        achieved
    :type chunkOnIDR: int
    :param drmType: Type of DRM encryption to use. Options are: none
        (no encryption), evo (AES Encryption), SAMPLE-AES (Sample-AES),
        verimatrix (Verimatrix DRM). For Verimatrix DRM, the "drm"
        section of the config.lua file must be active and properly
        configured
    :type drmType: str
    :param AESKeyCount: Number of keys that will be automatically
        generated and rotated over while encrypting this HLS stream
    :type AESKeyCount: int
    :param audioOnly: If true, the stream will be audio only
    :type audioOnly: int
    :param hlsResume: If true, HLS will resume appending segments to a
        previously created child playlist, even in cases of EMS
        shutdown or a cut-off stream source
    :type hlsResume: int
    :param cleanupOnClose: If true, the HLS files corresponding to a
        stream will be deleted if said stream is removed, shut down or
        disconnected
    :type cleanupOnClose: int
    :param useByteRange: If true, will use the EXT-X-BYTERANGE feature
        of HLS (version 4 and up)
    :type useByteRange: int
    :param fileLength: When using useByteRange=1, this parameter needs
        to be set too. This will be the size of the file before
        chunking it to another file; it replaces chunkLength in the
        case of EXT-X-BYTERANGE, since chunkLength will be the byte
        range chunk
    :type fileLength: int
    :param useSystemTime: If true, uses UTC in the playlist timestamp;
        otherwise the local server time will be used
    :type useSystemTime: int
    :param offsetTime:
    :type offsetTime: int
    :param startOffset: A parameter valid only for HLS v.6 onwards.
        This will indicate the start offset time (in seconds) for the
        playback of the playlist
    :type startOffset: int
    :link: http://docs.evostream.com/ems_api_definition/createhlsstream
    """
    return self.protocol.execute('createhlsstream',
                                 localStreamNames=localStreamNames,
                                 targetFolder=targetFolder, **kwargs)
[ "Create", "an", "HTTP", "Live", "Stream", "(", "HLS", ")", "out", "of", "an", "existing", "H", ".", "264", "/", "AAC", "stream", ".", "HLS", "is", "used", "to", "stream", "live", "feeds", "to", "iOS", "devices", "such", "as", "iPhones", "and", "iPads", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L206-L331
[ "def", "create_hls_stream", "(", "self", ",", "localStreamNames", ",", "targetFolder", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'createhlsstream'", ",", "localStreamNames", "=", "localStreamNames", ",", "targetFolder", "=", "targetFolder", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
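The `create_hls_stream` wrapper above is a thin pass-through to `self.protocol.execute`. The sketch below shows that call flow using a hypothetical stub in place of the real pyems protocol object (which would contact a running EvoStream server); the stream name and target folder are made-up illustrations, not values from the source.

```python
class StubProtocol:
    """Hypothetical stand-in for the pyems protocol layer: it records the
    command and parameters instead of calling a live EvoStream server."""
    def execute(self, command, **params):
        return {'command': command, 'params': params}


class Api:
    def __init__(self, protocol):
        self.protocol = protocol

    def create_hls_stream(self, localStreamNames, targetFolder, **kwargs):
        # Only the two required arguments are named; every optional
        # setting (playlistType, chunkLength, ...) travels in kwargs.
        return self.protocol.execute('createhlsstream',
                                     localStreamNames=localStreamNames,
                                     targetFolder=targetFolder,
                                     **kwargs)


api = Api(StubProtocol())
result = api.create_hls_stream('livecam', '/var/www/hls',
                               playlistType='rolling',
                               playlistLength=10,
                               chunkLength=6)
print(result['command'])   # -> createhlsstream
```

Because the wrapper forwards `**kwargs` untouched, any parameter listed in the docstring can be supplied by keyword without changing the wrapper itself.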
valid
Api.create_hds_stream
Create an HDS (HTTP Dynamic Streaming) stream out of an existing H.264/AAC stream. HDS is used to stream standard MP4 media over regular HTTP connections. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names). :type localStreamNames: str :param targetFolder: The folder where all the manifest (*.f4m) and fragment (f4v*) files will be stored. This folder must be accessible by the HDS clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in localStreamNames. Again, this can be a comma-delimited list. :type bandwidths: int :param chunkBaseName: The base name used to generate the fragments. :type chunkBaseName: str :param chunkLength: The length (in seconds) of fragments to be made. Minimum value is 1 (second). :type chunkLength: int :param chunkOnIDR: If true, chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param groupName: The name assigned to the HDS stream or group. If the localStreamNames parameter contains only one entry and groupName is not specified, groupName will have the value of the input stream name. :type groupName: str :param keepAlive: If true, the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param manifestName: The manifest file name. :type manifestName: str :param overwriteDestination: If true, it will allow overwrite of destination files. :type overwriteDestination: int :param playlistType: Either appending or rolling. :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when playlistType is "rolling". Ignored otherwise. 
:type playlistLength: int :param staleRetentionCount: The number of old files kept besides the ones listed in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param createMasterPlaylist: If true, a master playlist will be created. :type createMasterPlaylist: int :param cleanupDestination: If true, all manifest and fragment files in the target folder will be removed before HDS creation is started. :type cleanupDestination: int :link: http://docs.evostream.com/ems_api_definition/createhdsstream
pyems/__init__.py
def create_hds_stream(self, localStreamNames, targetFolder, **kwargs): """ Create an HDS (HTTP Dynamic Streaming) stream out of an existing H.264/AAC stream. HDS is used to stream standard MP4 media over regular HTTP connections. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names). :type localStreamNames: str :param targetFolder: The folder where all the manifest (*.f4m) and fragment (f4v*) files will be stored. This folder must be accessible by the HDS clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in localStreamNames. Again, this can be a comma-delimited list. :type bandwidths: int :param chunkBaseName: The base name used to generate the fragments. :type chunkBaseName: str :param chunkLength: The length (in seconds) of fragments to be made. Minimum value is 1 (second). :type chunkLength: int :param chunkOnIDR: If true, chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param groupName: The name assigned to the HDS stream or group. If the localStreamNames parameter contains only one entry and groupName is not specified, groupName will have the value of the input stream name. :type groupName: str :param keepAlive: If true, the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param manifestName: The manifest file name. :type manifestName: str :param overwriteDestination: If true, it will allow overwrite of destination files. :type overwriteDestination: int :param playlistType: Either appending or rolling. :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when playlistType is "rolling". Ignored otherwise. 
:type playlistLength: int :param staleRetentionCount: The number of old files kept besides the ones listed in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param createMasterPlaylist: If true, a master playlist will be created. :type createMasterPlaylist: int :param cleanupDestination: If true, all manifest and fragment files in the target folder will be removed before HDS creation is started. :type cleanupDestination: int :link: http://docs.evostream.com/ems_api_definition/createhdsstream """ return self.protocol.execute('createhdsstream', localStreamNames=localStreamNames, targetFolder=targetFolder, **kwargs)
def create_hds_stream(self, localStreamNames, targetFolder, **kwargs): """ Create an HDS (HTTP Dynamic Streaming) stream out of an existing H.264/AAC stream. HDS is used to stream standard MP4 media over regular HTTP connections. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names). :type localStreamNames: str :param targetFolder: The folder where all the manifest (*.f4m) and fragment (f4v*) files will be stored. This folder must be accessible by the HDS clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in localStreamNames. Again, this can be a comma-delimited list. :type bandwidths: int :param chunkBaseName: The base name used to generate the fragments. :type chunkBaseName: str :param chunkLength: The length (in seconds) of fragments to be made. Minimum value is 1 (second). :type chunkLength: int :param chunkOnIDR: If true, chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param groupName: The name assigned to the HDS stream or group. If the localStreamNames parameter contains only one entry and groupName is not specified, groupName will have the value of the input stream name. :type groupName: str :param keepAlive: If true, the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param manifestName: The manifest file name. :type manifestName: str :param overwriteDestination: If true, it will allow overwrite of destination files. :type overwriteDestination: int :param playlistType: Either appending or rolling. :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when playlistType is "rolling". Ignored otherwise. 
:type playlistLength: int :param staleRetentionCount: The number of old files kept besides the ones listed in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param createMasterPlaylist: If true, a master playlist will be created. :type createMasterPlaylist: int :param cleanupDestination: If true, all manifest and fragment files in the target folder will be removed before HDS creation is started. :type cleanupDestination: int :link: http://docs.evostream.com/ems_api_definition/createhdsstream """ return self.protocol.execute('createhdsstream', localStreamNames=localStreamNames, targetFolder=targetFolder, **kwargs)
[ "Create", "an", "HDS", "(", "HTTP", "Dynamic", "Streaming", ")", "stream", "out", "of", "an", "existing", "H", ".", "264", "/", "AAC", "stream", ".", "HDS", "is", "used", "to", "stream", "standard", "MP4", "media", "over", "regular", "HTTP", "connections", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L338-L413
[ "def", "create_hds_stream", "(", "self", ",", "localStreamNames", ",", "targetFolder", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'createhdsstream'", ",", "localStreamNames", "=", "localStreamNames", ",", "targetFolder", "=", "targetFolder", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
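For HDS, the comma-delimited `localStreamNames` and `bandwidths` lists pair up positionally. A minimal sketch with a hypothetical protocol stub (the rendition names and bitrates are illustrative, not from the source):

```python
class StubProtocol:
    # Hypothetical recorder standing in for the live EMS protocol.
    def execute(self, command, **params):
        return {'command': command, 'params': params}


class Api:
    def __init__(self, protocol):
        self.protocol = protocol

    def create_hds_stream(self, localStreamNames, targetFolder, **kwargs):
        return self.protocol.execute('createhdsstream',
                                     localStreamNames=localStreamNames,
                                     targetFolder=targetFolder,
                                     **kwargs)


api = Api(StubProtocol())
# Three renditions of one feed; bandwidths line up positionally.
result = api.create_hds_stream('cam_low,cam_mid,cam_high', '/var/www/hds',
                               bandwidths='400000,800000,1600000',
                               playlistType='rolling',
                               playlistLength=20)
pairs = dict(zip(result['params']['localStreamNames'].split(','),
                 result['params']['bandwidths'].split(',')))
print(pairs['cam_high'])   # -> 1600000
```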
valid
Api.create_mss_stream
Create a Microsoft Smooth Stream (MSS) out of an existing H.264/AAC stream. Smooth Streaming was developed by Microsoft to compete with other adaptive streaming technologies. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names) :type localStreamNames: str :param targetFolder: The folder where all the manifest and fragment files will be stored. This folder must be accessible by the MSS clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in `localStreamNames`. Again, this can be a comma-delimited list. :type bandwidths: int or str :param groupName: The name assigned to the MSS stream or group. If the `localStreamNames` parameter contains only one entry and groupName is not specified, groupName will have the value of the input stream name. :type groupName: str :param playlistType: Either `appending` or `rolling` :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when `playlistType` is `rolling`. Ignored otherwise. :type playlistLength: int :param manifestName: The manifest file name. :type manifestName: str :param chunkLength: The length (in seconds) of fragments to be made. :type chunkLength: int :param chunkOnIDR: If 1 (true), chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param keepAlive: If 1 (true), the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param overwriteDestination: If 1 (true), it will allow overwrite of destination files. :type overwriteDestination: int :param staleRetentionCount: How many old files are kept besides the ones present in the current version of the playlist. Only applicable for rolling playlists. 
:type staleRetentionCount: int :param cleanupDestination: If 1 (true), all manifest and fragment files in the target folder will be removed before MSS creation is started. :type cleanupDestination: int :param ismType: Either ismc for serving content to clients or isml for serving content to a Smooth Streaming server. :type ismType: str :param isLive: If true, creates a live MSS stream, otherwise set to 0 for VOD. :type isLive: int :param publishingPoint: Needed when `ismType=isml`; it is the REST URI where the MSS contents will be ingested. :type publishingPoint: str :param ingestMode: Either `single` for a non-looping ingest or `loop` for looping an ingest. :type ingestMode: str :link: http://docs.evostream.com/ems_api_definition/createmssstream
pyems/__init__.py
def create_mss_stream(self, localStreamNames, targetFolder, **kwargs): """ Create a Microsoft Smooth Stream (MSS) out of an existing H.264/AAC stream. Smooth Streaming was developed by Microsoft to compete with other adaptive streaming technologies. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names) :type localStreamNames: str :param targetFolder: The folder where all the manifest and fragment files will be stored. This folder must be accessible by the MSS clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in `localStreamNames`. Again, this can be a comma-delimited list. :type bandwidths: int or str :param groupName: The name assigned to the MSS stream or group. If the `localStreamNames` parameter contains only one entry and groupName is not specified, groupName will have the value of the input stream name. :type groupName: str :param playlistType: Either `appending` or `rolling` :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when `playlistType` is `rolling`. Ignored otherwise. :type playlistLength: int :param manifestName: The manifest file name. :type manifestName: str :param chunkLength: The length (in seconds) of fragments to be made. :type chunkLength: int :param chunkOnIDR: If 1 (true), chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param keepAlive: If 1 (true), the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param overwriteDestination: If 1 (true), it will allow overwrite of destination files. 
:type overwriteDestination: int :param staleRetentionCount: How many old files are kept besides the ones present in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param cleanupDestination: If 1 (true), all manifest and fragment files in the target folder will be removed before MSS creation is started. :type cleanupDestination: int :param ismType: Either ismc for serving content to clients or isml for serving content to a Smooth Streaming server. :type ismType: str :param isLive: If true, creates a live MSS stream, otherwise set to 0 for VOD. :type isLive: int :param publishingPoint: Needed when `ismType=isml`; it is the REST URI where the MSS contents will be ingested. :type publishingPoint: str :param ingestMode: Either `single` for a non-looping ingest or `loop` for looping an ingest. :type ingestMode: str :link: http://docs.evostream.com/ems_api_definition/createmssstream """ return self.protocol.execute('createmssstream', localStreamNames=localStreamNames, targetFolder=targetFolder, **kwargs)
def create_mss_stream(self, localStreamNames, targetFolder, **kwargs): """ Create a Microsoft Smooth Stream (MSS) out of an existing H.264/AAC stream. Smooth Streaming was developed by Microsoft to compete with other adaptive streaming technologies. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names) :type localStreamNames: str :param targetFolder: The folder where all the manifest and fragment files will be stored. This folder must be accessible by the MSS clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in `localStreamNames`. Again, this can be a comma-delimited list. :type bandwidths: int or str :param groupName: The name assigned to the MSS stream or group. If the `localStreamNames` parameter contains only one entry and groupName is not specified, groupName will have the value of the input stream name. :type groupName: str :param playlistType: Either `appending` or `rolling` :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when `playlistType` is `rolling`. Ignored otherwise. :type playlistLength: int :param manifestName: The manifest file name. :type manifestName: str :param chunkLength: The length (in seconds) of fragments to be made. :type chunkLength: int :param chunkOnIDR: If 1 (true), chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param keepAlive: If 1 (true), the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param overwriteDestination: If 1 (true), it will allow overwrite of destination files. 
:type overwriteDestination: int :param staleRetentionCount: How many old files are kept besides the ones present in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param cleanupDestination: If 1 (true), all manifest and fragment files in the target folder will be removed before MSS creation is started. :type cleanupDestination: int :param ismType: Either ismc for serving content to clients or isml for serving content to a Smooth Streaming server. :type ismType: str :param isLive: If true, creates a live MSS stream, otherwise set to 0 for VOD. :type isLive: int :param publishingPoint: Needed when `ismType=isml`; it is the REST URI where the MSS contents will be ingested. :type publishingPoint: str :param ingestMode: Either `single` for a non-looping ingest or `loop` for looping an ingest. :type ingestMode: str :link: http://docs.evostream.com/ems_api_definition/createmssstream """ return self.protocol.execute('createmssstream', localStreamNames=localStreamNames, targetFolder=targetFolder, **kwargs)
[ "Create", "a", "Microsoft", "Smooth", "Stream", "(", "MSS", ")", "out", "of", "an", "existing", "H", ".", "264", "/", "AAC", "stream", ".", "Smooth", "Streaming", "was", "developed", "by", "Microsoft", "to", "compete", "with", "other", "adaptive", "streaming", "technologies", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L420-L503
[ "def", "create_mss_stream", "(", "self", ",", "localStreamNames", ",", "targetFolder", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'createmssstream'", ",", "localStreamNames", "=", "localStreamNames", ",", "targetFolder", "=", "targetFolder", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
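Per the MSS docstring, `publishingPoint` is needed when `ismType` is `isml` (ingest to a Smooth Streaming server rather than serving clients directly). A sketch of such a call against a hypothetical protocol stub; the event name and publishing-point URI are invented for illustration:

```python
class StubProtocol:
    # Hypothetical recorder in place of the real EMS protocol.
    def execute(self, command, **params):
        return {'command': command, 'params': params}


class Api:
    def __init__(self, protocol):
        self.protocol = protocol

    def create_mss_stream(self, localStreamNames, targetFolder, **kwargs):
        return self.protocol.execute('createmssstream',
                                     localStreamNames=localStreamNames,
                                     targetFolder=targetFolder,
                                     **kwargs)


api = Api(StubProtocol())
# isml ingest: the fragments are pushed to a Smooth Streaming
# publishing point instead of being served from targetFolder alone.
result = api.create_mss_stream('event1', '/var/www/mss',
                               ismType='isml',
                               publishingPoint='http://smooth.example.com/event1.isml',
                               ingestMode='single')
print(result['params']['ismType'])   # -> isml
```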
valid
Api.create_dash_stream
Create Dynamic Adaptive Streaming over HTTP (DASH) out of an existing H.264/AAC stream. DASH was developed by the Moving Picture Experts Group (MPEG) to establish a standard for HTTP adaptive-bitrate streaming that would be accepted by multiple vendors and facilitate interoperability. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names). :type localStreamNames: str :param targetFolder: The folder where all the manifest and fragment files will be stored. This folder must be accessible by the DASH clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in `localStreamNames`. Again, this can be a comma-delimited list. :type bandwidths: int or str :param groupName: The name assigned to the DASH stream or group. If the `localStreamNames` parameter contains only one entry and `groupName` is not specified, `groupName` will have the value of the input stream name. :type groupName: str :param playlistType: Either `appending` or `rolling`. :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when `playlistType` is `rolling`. Ignored otherwise. :type playlistLength: int :param manifestName: The manifest file name. :type manifestName: str :param chunkLength: The length (in seconds) of fragments to be made. :type chunkLength: int :param chunkOnIDR: If true, chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param keepAlive: If true, the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param overwriteDestination: If true, it will allow overwrite of destination files. 
:type overwriteDestination: int :param staleRetentionCount: How many old files are kept besides the ones present in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param cleanupDestination: If true, all manifest and fragment files in the target folder will be removed before DASH creation is started. :type cleanupDestination: int :param dynamicProfile: Set this parameter to 1 (default) for a live DASH, otherwise set it to 0 for a VOD. :type dynamicProfile: int :link: http://docs.evostream.com/ems_api_definition/createdashstream
pyems/__init__.py
def create_dash_stream(self, localStreamNames, targetFolder, **kwargs): """ Create Dynamic Adaptive Streaming over HTTP (DASH) out of an existing H.264/AAC stream. DASH was developed by the Moving Picture Experts Group (MPEG) to establish a standard for HTTP adaptive-bitrate streaming that would be accepted by multiple vendors and facilitate interoperability. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names). :type localStreamNames: str :param targetFolder: The folder where all the manifest and fragment files will be stored. This folder must be accessible by the DASH clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in `localStreamNames`. Again, this can be a comma-delimited list. :type bandwidths: int or str :param groupName: The name assigned to the DASH stream or group. If the `localStreamNames` parameter contains only one entry and `groupName` is not specified, `groupName` will have the value of the input stream name. :type groupName: str :param playlistType: Either `appending` or `rolling`. :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when `playlistType` is `rolling`. Ignored otherwise. :type playlistLength: int :param manifestName: The manifest file name. :type manifestName: str :param chunkLength: The length (in seconds) of fragments to be made. :type chunkLength: int :param chunkOnIDR: If true, chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param keepAlive: If true, the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param overwriteDestination: If true, it will allow overwrite of destination files. 
:type overwriteDestination: int :param staleRetentionCount: How many old files are kept besides the ones present in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param cleanupDestination: If true, all manifest and fragment files in the target folder will be removed before DASH creation is started. :type cleanupDestination: int :param dynamicProfile: Set this parameter to 1 (default) for a live DASH, otherwise set it to 0 for a VOD. :type dynamicProfile: int :link: http://docs.evostream.com/ems_api_definition/createdashstream """ return self.protocol.execute('createdashstream', localStreamNames=localStreamNames, targetFolder=targetFolder, **kwargs)
def create_dash_stream(self, localStreamNames, targetFolder, **kwargs): """ Create Dynamic Adaptive Streaming over HTTP (DASH) out of an existing H.264/AAC stream. DASH was developed by the Moving Picture Experts Group (MPEG) to establish a standard for HTTP adaptive-bitrate streaming that would be accepted by multiple vendors and facilitate interoperability. :param localStreamNames: The stream(s) that will be used as the input. This is a comma-delimited list of active stream names (local stream names). :type localStreamNames: str :param targetFolder: The folder where all the manifest and fragment files will be stored. This folder must be accessible by the DASH clients. It is usually in the web-root of the server. :type targetFolder: str :param bandwidths: The corresponding bandwidths for each stream listed in `localStreamNames`. Again, this can be a comma-delimited list. :type bandwidths: int or str :param groupName: The name assigned to the DASH stream or group. If the `localStreamNames` parameter contains only one entry and `groupName` is not specified, `groupName` will have the value of the input stream name. :type groupName: str :param playlistType: Either `appending` or `rolling`. :type playlistType: str :param playlistLength: The number of fragments before the server starts to overwrite the older fragments. Used only when `playlistType` is `rolling`. Ignored otherwise. :type playlistLength: int :param manifestName: The manifest file name. :type manifestName: str :param chunkLength: The length (in seconds) of fragments to be made. :type chunkLength: int :param chunkOnIDR: If true, chunking is performed ONLY on IDR. Otherwise, chunking is performed whenever chunk length is achieved. :type chunkOnIDR: int :param keepAlive: If true, the EMS will attempt to reconnect to the stream source if the connection is severed. :type keepAlive: int :param overwriteDestination: If true, it will allow overwrite of destination files. 
:type overwriteDestination: int :param staleRetentionCount: How many old files are kept besides the ones present in the current version of the playlist. Only applicable for rolling playlists. :type staleRetentionCount: int :param cleanupDestination: If true, all manifest and fragment files in the target folder will be removed before DASH creation is started. :type cleanupDestination: int :param dynamicProfile: Set this parameter to 1 (default) for a live DASH, otherwise set it to 0 for a VOD. :type dynamicProfile: int :link: http://docs.evostream.com/ems_api_definition/createdashstream """ return self.protocol.execute('createdashstream', localStreamNames=localStreamNames, targetFolder=targetFolder, **kwargs)
[ "Create", "Dynamic", "Adaptive", "Streaming", "over", "HTTP", "(", "DASH", ")", "out", "of", "an", "existing", "H", ".", "264", "/", "AAC", "stream", ".", "DASH", "was", "developed", "by", "the", "Moving", "Picture", "Experts", "Group", "(", "MPEG", ")", "to", "establish", "a", "standard", "for", "HTTP", "adaptive", "-", "bitrate", "streaming", "that", "would", "be", "accepted", "by", "multiple", "vendors", "and", "facilitate", "interoperability", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L509-L581
[ "def", "create_dash_stream", "(", "self", ",", "localStreamNames", ",", "targetFolder", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'createdashstream'", ",", "localStreamNames", "=", "localStreamNames", ",", "targetFolder", "=", "targetFolder", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
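The DASH docstring distinguishes live from VOD via `dynamicProfile` (1, the default, for live; 0 for VOD). A sketch of both calls with a hypothetical protocol stub; stream names and folders are illustrative:

```python
class StubProtocol:
    # Hypothetical recorder standing in for the live EMS protocol.
    def execute(self, command, **params):
        return {'command': command, 'params': params}


class Api:
    def __init__(self, protocol):
        self.protocol = protocol

    def create_dash_stream(self, localStreamNames, targetFolder, **kwargs):
        return self.protocol.execute('createdashstream',
                                     localStreamNames=localStreamNames,
                                     targetFolder=targetFolder,
                                     **kwargs)


api = Api(StubProtocol())
# dynamicProfile=1 (the default) requests a live DASH stream ...
live = api.create_dash_stream('livecam', '/var/www/dash', dynamicProfile=1)
# ... while dynamicProfile=0 requests a VOD presentation.
vod = api.create_dash_stream('movie', '/var/www/dash/movie', dynamicProfile=0)
print(live['params']['dynamicProfile'], vod['params']['dynamicProfile'])   # -> 1 0
```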
valid
Api.record
Records any inbound stream. The record command allows users to record a stream that may not yet exist. When a new stream is brought into the server, it is checked against a list of streams to be recorded. Streams can be recorded as FLV files, MPEG-TS files or as MP4 files. :param localStreamName: The name of the stream to be used as input for recording. :type localStreamName: str :param pathToFile: Specify path and file name to write to. :type pathToFile: str :param type: `ts`, `mp4` or `flv` :type type: str :param overwrite: If false, when a file already exists for the stream name, a new file will be created with the next appropriate number appended. If 1 (true), files with the same name will be overwritten. :type overwrite: int :param keepAlive: If 1 (true), the server will restart recording every time the stream becomes available again. :type keepAlive: int :param chunkLength: If non-zero the record command will start a new recording file after ChunkLength seconds have elapsed. :type chunkLength: int :param waitForIDR: This is used if the recording is being chunked. When true, new files will only be created on IDR boundaries. :type waitForIDR: int :param winQtCompat: Mandates 32bit header fields to ensure compatibility with Windows QuickTime. :type winQtCompat: int :param dateFolderStructure: If set to 1 (true), folders will be created with names in `YYYYMMDD` format. Recorded files will be placed inside these folders based on the date they were created. :type dateFolderStructure: int :link: http://docs.evostream.com/ems_api_definition/record
pyems/__init__.py
def record(self, localStreamName, pathToFile, **kwargs): """ Records any inbound stream. The record command allows users to record a stream that may not yet exist. When a new stream is brought into the server, it is checked against a list of streams to be recorded. Streams can be recorded as FLV files, MPEG-TS files or as MP4 files. :param localStreamName: The name of the stream to be used as input for recording. :type localStreamName: str :param pathToFile: Specify path and file name to write to. :type pathToFile: str :param type: `ts`, `mp4` or `flv` :type type: str :param overwrite: If false, when a file already exists for the stream name, a new file will be created with the next appropriate number appended. If 1 (true), files with the same name will be overwritten. :type overwrite: int :param keepAlive: If 1 (true), the server will restart recording every time the stream becomes available again. :type keepAlive: int :param chunkLength: If non-zero the record command will start a new recording file after ChunkLength seconds have elapsed. :type chunkLength: int :param waitForIDR: This is used if the recording is being chunked. When true, new files will only be created on IDR boundaries. :type waitForIDR: int :param winQtCompat: Mandates 32bit header fields to ensure compatibility with Windows QuickTime. :type winQtCompat: int :param dateFolderStructure: If set to 1 (true), folders will be created with names in `YYYYMMDD` format. Recorded files will be placed inside these folders based on the date they were created. :type dateFolderStructure: int :link: http://docs.evostream.com/ems_api_definition/record """ return self.protocol.execute('record', localStreamName=localStreamName, pathToFile=pathToFile, **kwargs)
def record(self, localStreamName, pathToFile, **kwargs): """ Records any inbound stream. The record command allows users to record a stream that may not yet exist. When a new stream is brought into the server, it is checked against a list of streams to be recorded. Streams can be recorded as FLV files, MPEG-TS files or as MP4 files. :param localStreamName: The name of the stream to be used as input for recording. :type localStreamName: str :param pathToFile: Specify path and file name to write to. :type pathToFile: str :param type: `ts`, `mp4` or `flv` :type type: str :param overwrite: If false, when a file already exists for the stream name, a new file will be created with the next appropriate number appended. If 1 (true), files with the same name will be overwritten. :type overwrite: int :param keepAlive: If 1 (true), the server will restart recording every time the stream becomes available again. :type keepAlive: int :param chunkLength: If non-zero the record command will start a new recording file after ChunkLength seconds have elapsed. :type chunkLength: int :param waitForIDR: This is used if the recording is being chunked. When true, new files will only be created on IDR boundaries. :type waitForIDR: int :param winQtCompat: Mandates 32bit header fields to ensure compatibility with Windows QuickTime. :type winQtCompat: int :param dateFolderStructure: If set to 1 (true), folders will be created with names in `YYYYMMDD` format. Recorded files will be placed inside these folders based on the date they were created. :type dateFolderStructure: int :link: http://docs.evostream.com/ems_api_definition/record """ return self.protocol.execute('record', localStreamName=localStreamName, pathToFile=pathToFile, **kwargs)
[ "Records", "any", "inbound", "stream", ".", "The", "record", "command", "allows", "users", "to", "record", "a", "stream", "that", "may", "not", "yet", "exist", ".", "When", "a", "new", "stream", "is", "brought", "into", "the", "server", "it", "is", "checked", "against", "a", "list", "of", "streams", "to", "be", "recorded", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L586-L635
[ "def", "record", "(", "self", ",", "localStreamName", ",", "pathToFile", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'record'", ",", "localStreamName", "=", "localStreamName", ",", "pathToFile", "=", "pathToFile", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
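A minimal sketch of how this `record` wrapper forwards its arguments. The `DummyProtocol` below is a hypothetical stand-in for the real EMS protocol object (an assumption for illustration only, not part of pyems); it simply echoes what would be sent to the server.

```python
class DummyProtocol:
    """Hypothetical stand-in for the EMS protocol object (not pyems code)."""
    def execute(self, command, **params):
        # Echo the command and its parameters instead of contacting a server.
        return {'command': command, 'params': params}

class Api:
    def __init__(self, protocol):
        self.protocol = protocol

    def record(self, localStreamName, pathToFile, **kwargs):
        # Mirrors the wrapper above: the two required arguments are named,
        # everything else (type, overwrite, keepAlive, ...) passes through.
        return self.protocol.execute('record',
                                     localStreamName=localStreamName,
                                     pathToFile=pathToFile,
                                     **kwargs)

api = Api(DummyProtocol())
result = api.record('mystream', '/tmp/rec/mystream.mp4', type='mp4', overwrite=1)
```

With a real protocol instance the same call shape applies; optional parameters from the docstring travel through `**kwargs` unchanged.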
valid
Api.transcode
Changes the compression characteristics of an audio and/or video stream. Allows you to change the resolution of a source stream, change the bitrate of a stream, change a VP8 or MPEG2 stream into H.264 and much more. Allows users to create overlays on the final stream as well as crop streams. :param source: Can be a URI or a local stream name from EMS. :type source: str :param destinations: The target URI(s) or stream name(s) of the transcoded stream. If only a name is given, it will be pushed back to the EMS. :type destinations: str :param targetStreamNames: The name of the stream(s) at destination(s). If not specified, and a full URI is provided to destinations, name will have a time stamped value. :type targetStreamNames: str :param groupName: The group name assigned to this process. If not specified, groupName will have a random value. :type groupName: str :param videoBitrates: Target output video bitrate(s) (in bits/s, append `k` to value for kbits/s). Accepts the value `copy` to copy the input bitrate. An empty value passed would mean no video. :type videoBitrates: str :param videoSizes: Target output video size(s) in wxh (width x height) format, e.g. 240x480. :type videoSizes: str :param videoAdvancedParamsProfiles: Name of video profile template that will be used. :type videoAdvancedParamsProfiles: str :param audioBitrates: Target output audio bitrate(s) (in bits/s, append `k` to value for kbits/s). Accepts the value `copy` to copy the input bitrate. An empty value passed would mean no audio. :type audioBitrates: str :param audioChannelsCounts: Target output audio channel(s) count(s). Valid values are 1 (mono), 2 (stereo), and so on. Actual supported channel count is dependent on the number of input audio channels. :type audioChannelsCounts: str :param audioFrequencies: Target output audio frequency(ies) (in Hz, append `k` to value for kHz). :type audioFrequencies: str :param audioAdvancedParamsProfiles: Name of audio profile template that will be used. 
:type audioAdvancedParamsProfiles: str :param overlays: Location of the overlay source(s) to be used. These are transparent images (normally in PNG format) that have the same or smaller size than the video. Image is placed at the top-left position of the video. :type overlays: str :param croppings: Target video cropping position(s) and size(s) in `left : top : width : height` format (e.g. 0:0:200:100). Positions are optional (200:100 for a centered cropping of 200 width and 100 height in pixels). Values are limited to the actual size of the video. :type croppings: str :param keepAlive: If keepAlive is set to 1, the server will restart transcoding if it was previously activated. :type keepAlive: int :param commandFlags: Other commands to the transcode process that are not supported by the baseline transcode command. :type commandFlags: str :link: http://docs.evostream.com/ems_api_definition/transcode
pyems/__init__.py
def transcode(self, source, destinations, **kwargs): """ Changes the compression characteristics of an audio and/or video stream. Allows you to change the resolution of a source stream, change the bitrate of a stream, change a VP8 or MPEG2 stream into H.264 and much more. Allow users to create overlays on the final stream as well as crop streams. :param source: Can be a URI or a local stream name from EMS. :type source: str :param destinations: The target URI(s) or stream name(s) of the transcoded stream. If only a name is given, it will be pushed back to the EMS. :type destinations: str :param targetStreamNames: The name of the stream(s) at destination(s). If not specified, and a full URI is provided to destinations, name will have a time stamped value. :type targetStreamNames: str :param groupName: The group name assigned to this process. If not specified, groupName will have a random value. :type groupName: str :param videoBitrates: Target output video bitrate(s) (in bits/s, append `k` to value for kbits/s). Accepts the value `copy` to copy the input bitrate. An empty value passed would mean no video. :type videoBitrates: str :param videoSizes: Target output video size(s) in wxh (width x height) format. IE: 240x480. :type videoSizes: str :param videoAdvancedParamsProfiles: Name of video profile template that will be used. :type videoAdvancedParamsProfiles: str :param audioBitrates: Target output audio bitrate(s) (in bits/s, append `k` to value for kbits/s). Accepts the value `copy` to copy the input bitrate. An empty value passed would mean no audio. :type audioBitrates: str :param audioChannelsCounts: Target output audio channel(s) count(s). Valid values are 1 (mono), 2 (stereo), and so on. Actual supported channel count is dependent on the number of input audio channels. :type audioChannelsCounts: str :param audioFrequencies: Target output audio frequency(ies) (in Hz, append `k` to value for kHz). 
:type audioFrequencies: str :param audioAdvancedParamsProfiles: Name of audio profile template that will be used. :type audioAdvancedParamsProfiles: str :param overlays: Location of the overlay source(s) to be used. These are transparent images (normally in PNG format) that have the same or smaller size than the video. Image is placed at the top-left position of the video. :type overlays: str :param croppings: Target video cropping position(s) and size(s) in `left : top : width : height` format (e.g. 0:0:200:100. Positions are optional (200:100 for a centered cropping of 200 width and 100 height in pixels). Values are limited to the actual size of the video. :type croppings: str :param keepAlive: If keepAlive is set to 1, the server will restart transcoding if it was previously activated. :type keepAlive: int :param commandFlags: Other commands to the transcode process that are not supported by the baseline transcode command. :type commandFlags: str :link: http://docs.evostream.com/ems_api_definition/transcode """ return self.protocol.execute('transcode', source=source, destinations=destinations, **kwargs)
def transcode(self, source, destinations, **kwargs): """ Changes the compression characteristics of an audio and/or video stream. Allows you to change the resolution of a source stream, change the bitrate of a stream, change a VP8 or MPEG2 stream into H.264 and much more. Allow users to create overlays on the final stream as well as crop streams. :param source: Can be a URI or a local stream name from EMS. :type source: str :param destinations: The target URI(s) or stream name(s) of the transcoded stream. If only a name is given, it will be pushed back to the EMS. :type destinations: str :param targetStreamNames: The name of the stream(s) at destination(s). If not specified, and a full URI is provided to destinations, name will have a time stamped value. :type targetStreamNames: str :param groupName: The group name assigned to this process. If not specified, groupName will have a random value. :type groupName: str :param videoBitrates: Target output video bitrate(s) (in bits/s, append `k` to value for kbits/s). Accepts the value `copy` to copy the input bitrate. An empty value passed would mean no video. :type videoBitrates: str :param videoSizes: Target output video size(s) in wxh (width x height) format. IE: 240x480. :type videoSizes: str :param videoAdvancedParamsProfiles: Name of video profile template that will be used. :type videoAdvancedParamsProfiles: str :param audioBitrates: Target output audio bitrate(s) (in bits/s, append `k` to value for kbits/s). Accepts the value `copy` to copy the input bitrate. An empty value passed would mean no audio. :type audioBitrates: str :param audioChannelsCounts: Target output audio channel(s) count(s). Valid values are 1 (mono), 2 (stereo), and so on. Actual supported channel count is dependent on the number of input audio channels. :type audioChannelsCounts: str :param audioFrequencies: Target output audio frequency(ies) (in Hz, append `k` to value for kHz). 
:type audioFrequencies: str :param audioAdvancedParamsProfiles: Name of audio profile template that will be used. :type audioAdvancedParamsProfiles: str :param overlays: Location of the overlay source(s) to be used. These are transparent images (normally in PNG format) that have the same or smaller size than the video. Image is placed at the top-left position of the video. :type overlays: str :param croppings: Target video cropping position(s) and size(s) in `left : top : width : height` format (e.g. 0:0:200:100. Positions are optional (200:100 for a centered cropping of 200 width and 100 height in pixels). Values are limited to the actual size of the video. :type croppings: str :param keepAlive: If keepAlive is set to 1, the server will restart transcoding if it was previously activated. :type keepAlive: int :param commandFlags: Other commands to the transcode process that are not supported by the baseline transcode command. :type commandFlags: str :link: http://docs.evostream.com/ems_api_definition/transcode """ return self.protocol.execute('transcode', source=source, destinations=destinations, **kwargs)
[ "Changes", "the", "compression", "characteristics", "of", "an", "audio", "and", "/", "or", "video", "stream", ".", "Allows", "you", "to", "change", "the", "resolution", "of", "a", "source", "stream", "change", "the", "bitrate", "of", "a", "stream", "change", "a", "VP8", "or", "MPEG2", "stream", "into", "H", ".", "264", "and", "much", "more", ".", "Allow", "users", "to", "create", "overlays", "on", "the", "final", "stream", "as", "well", "as", "crop", "streams", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L642-L722
[ "def", "transcode", "(", "self", ",", "source", ",", "destinations", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'transcode'", ",", "source", "=", "source", ",", "destinations", "=", "destinations", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
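The `videoSizes` parameter uses a `wxh` layout (e.g. `240x480`). A small helper, purely illustrative and not part of pyems, can validate that format before calling `transcode`:

```python
import re

def parse_video_size(spec):
    """Parse a wxh video size such as '240x480' into (width, height)."""
    m = re.fullmatch(r'(\d+)x(\d+)', spec)
    if m is None:
        raise ValueError('expected WIDTHxHEIGHT, e.g. 240x480, got %r' % spec)
    return int(m.group(1)), int(m.group(2))

width, height = parse_video_size('240x480')
```

A malformed specification such as `'240x'` raises `ValueError` instead of being forwarded to the server.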
valid
Api.add_stream_alias
Allows you to create secondary name(s) for internal streams. Once an alias is created the localstreamname cannot be used to request playback of that stream. Once an alias is used (requested by a client) the alias is removed. Aliases are designed to be used to protect/hide your source streams. :param localStreamName: The original stream name :type localStreamName: str :param aliasName: The alias alternative to the localStreamName :type aliasName: str :param expirePeriod: The expiration period for this alias. Negative values will be treated as one-shot but no longer than the absolute positive value in seconds, 0 means it will not expire, positive values mean the alias can be used multiple times but expires after this many seconds. The default is -600 (one-shot, 10 mins) :type expirePeriod: int :link: http://docs.evostream.com/ems_api_definition/addstreamalias
pyems/__init__.py
def add_stream_alias(self, localStreamName, aliasName, **kwargs): """ Allows you to create secondary name(s) for internal streams. Once an alias is created the localstreamname cannot be used to request playback of that stream. Once an alias is used (requested by a client) the alias is removed. Aliases are designed to be used to protect/hide your source streams. :param localStreamName: The original stream name :type localStreamName: str :param aliasName: The alias alternative to the localStreamName :type aliasName: str :param expirePeriod: The expiration period for this alias. Negative values will be treated as one-shot but no longer than the absolute positive value in seconds, 0 means it will not expire, positive values mean the alias can be used multiple times but expires after this many seconds. The default is -600 (one-shot, 10 mins) :type expirePeriod: int :link: http://docs.evostream.com/ems_api_definition/addstreamalias """ return self.protocol.execute('addStreamAlias', localStreamName=localStreamName, aliasName=aliasName, **kwargs)
def add_stream_alias(self, localStreamName, aliasName, **kwargs): """ Allows you to create secondary name(s) for internal streams. Once an alias is created the localstreamname cannot be used to request playback of that stream. Once an alias is used (requested by a client) the alias is removed. Aliases are designed to be used to protect/hide your source streams. :param localStreamName: The original stream name :type localStreamName: str :param aliasName: The alias alternative to the localStreamName :type aliasName: str :param expirePeriod: The expiration period for this alias. Negative values will be treated as one-shot but no longer than the absolute positive value in seconds, 0 means it will not expire, positive values mean the alias can be used multiple times but expires after this many seconds. The default is -600 (one-shot, 10 mins) :type expirePeriod: int :link: http://docs.evostream.com/ems_api_definition/addstreamalias """ return self.protocol.execute('addStreamAlias', localStreamName=localStreamName, aliasName=aliasName, **kwargs)
[ "Allows", "you", "to", "create", "secondary", "name", "(", "s", ")", "for", "internal", "streams", ".", "Once", "an", "alias", "is", "created", "the", "localstreamname", "cannot", "be", "used", "to", "request", "playback", "of", "that", "stream", ".", "Once", "an", "alias", "is", "used", "(", "requested", "by", "a", "client", ")", "the", "alias", "is", "removed", ".", "Aliases", "are", "designed", "to", "be", "used", "to", "protect", "/", "hide", "your", "source", "streams", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L869-L894
[ "def", "add_stream_alias", "(", "self", ",", "localStreamName", ",", "aliasName", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'addStreamAlias'", ",", "localStreamName", "=", "localStreamName", ",", "aliasName", "=", "aliasName", ",", "*", "*", "kwargs", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
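The documented `expirePeriod` semantics (negative = one-shot, capped at the absolute value in seconds; 0 = never expires; positive = reusable until expiry) can be modelled roughly as below. This is an illustrative sketch of the behaviour described in the docstring, not the server's implementation.

```python
import time

class StreamAlias:
    """Rough model of the documented expirePeriod semantics (sketch only)."""
    def __init__(self, localStreamName, expirePeriod=-600):
        self.local = localStreamName
        self.one_shot = expirePeriod < 0
        # 0 means the alias never expires; otherwise it lives abs(value) s.
        self.deadline = None if expirePeriod == 0 else time.time() + abs(expirePeriod)

    def consume(self):
        """Resolve the alias; one-shot aliases are spent after first use."""
        if self.deadline is not None and time.time() > self.deadline:
            raise KeyError('alias expired')
        if self.one_shot:
            self.deadline = time.time() - 1  # spent: any further use fails
        return self.local

alias = StreamAlias('camera1')       # default -600: one-shot, 10-minute window
resolved = alias.consume()
```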
valid
Api.add_group_name_alias
Creates secondary name(s) for group names. Once an alias is created the group name cannot be used to request HTTP playback of that stream. Once an alias is used (requested by a client) the alias is removed. Aliases are designed to be used to protect/hide your source streams. :param groupName: The original group name :type groupName: str :param aliasName: The alias alternative to the group name :type aliasName: str :link: http://docs.evostream.com/ems_api_definition/addgroupnamealias
pyems/__init__.py
def add_group_name_alias(self, groupName, aliasName): """ Creates secondary name(s) for group names. Once an alias is created the group name cannot be used to request HTTP playback of that stream. Once an alias is used (requested by a client) the alias is removed. Aliases are designed to be used to protect/hide your source streams. :param groupName: The original group name :type groupName: str :param aliasName: The alias alternative to the group name :type aliasName: str :link: http://docs.evostream.com/ems_api_definition/addgroupnamealias """ return self.protocol.execute('addGroupNameAlias', groupName=groupName, aliasName=aliasName)
def add_group_name_alias(self, groupName, aliasName): """ Creates secondary name(s) for group names. Once an alias is created the group name cannot be used to request HTTP playback of that stream. Once an alias is used (requested by a client) the alias is removed. Aliases are designed to be used to protect/hide your source streams. :param groupName: The original group name :type groupName: str :param aliasName: The alias alternative to the group name :type aliasName: str :link: http://docs.evostream.com/ems_api_definition/addgroupnamealias """ return self.protocol.execute('addGroupNameAlias', groupName=groupName, aliasName=aliasName)
[ "Creates", "secondary", "name", "(", "s", ")", "for", "group", "names", ".", "Once", "an", "alias", "is", "created", "the", "group", "name", "cannot", "be", "used", "to", "request", "HTTP", "playback", "of", "that", "stream", ".", "Once", "an", "alias", "is", "used", "(", "requested", "by", "a", "client", ")", "the", "alias", "is", "removed", ".", "Aliases", "are", "designed", "to", "be", "used", "to", "protect", "/", "hide", "your", "source", "streams", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L925-L941
[ "def", "add_group_name_alias", "(", "self", ",", "groupName", ",", "aliasName", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'addGroupNameAlias'", ",", "groupName", "=", "groupName", ",", "aliasName", "=", "aliasName", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
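Group-name aliases are one-shot by design: once requested by a client, the alias is removed. A dictionary-based sketch of that behaviour (illustrative only, not EMS internals):

```python
group_aliases = {}

def add_group_name_alias(groupName, aliasName):
    """Register an alias for a group name (sketch of the documented behaviour)."""
    group_aliases[aliasName] = groupName

def resolve_group_alias(aliasName):
    # pop() both resolves and removes the alias, mirroring the documented
    # one-shot behaviour for HTTP playback requests.
    return group_aliases.pop(aliasName)

add_group_name_alias('liveGroup', 'g7f3a')
resolved = resolve_group_alias('g7f3a')
```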
valid
Api.create_ingest_point
Creates an RTMP ingest point, which mandates that streams pushed into the EMS have a target stream name which matches one Ingest Point privateStreamName. :param privateStreamName: The name that RTMP Target Stream Names must match. :type privateStreamName: str :param publicStreamName: The name that is used to access the stream pushed to the privateStreamName. The publicStreamName becomes the stream's localStreamName. :type publicStreamName: str :link: http://docs.evostream.com/ems_api_definition/createingestpoint
pyems/__init__.py
def create_ingest_point(self, privateStreamName, publicStreamName): """ Creates an RTMP ingest point, which mandates that streams pushed into the EMS have a target stream name which matches one Ingest Point privateStreamName. :param privateStreamName: The name that RTMP Target Stream Names must match. :type privateStreamName: str :param publicStreamName: The name that is used to access the stream pushed to the privateStreamName. The publicStreamName becomes the streams localStreamName. :type publicStreamName: str :link: http://docs.evostream.com/ems_api_definition/createingestpoint """ return self.protocol.execute('createIngestPoint', privateStreamName=privateStreamName, publicStreamName=publicStreamName)
def create_ingest_point(self, privateStreamName, publicStreamName): """ Creates an RTMP ingest point, which mandates that streams pushed into the EMS have a target stream name which matches one Ingest Point privateStreamName. :param privateStreamName: The name that RTMP Target Stream Names must match. :type privateStreamName: str :param publicStreamName: The name that is used to access the stream pushed to the privateStreamName. The publicStreamName becomes the streams localStreamName. :type publicStreamName: str :link: http://docs.evostream.com/ems_api_definition/createingestpoint """ return self.protocol.execute('createIngestPoint', privateStreamName=privateStreamName, publicStreamName=publicStreamName)
[ "Creates", "an", "RTMP", "ingest", "point", "which", "mandates", "that", "streams", "pushed", "into", "the", "EMS", "have", "a", "target", "stream", "name", "which", "matches", "one", "Ingest", "Point", "privateStreamName", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L995-L1014
[ "def", "create_ingest_point", "(", "self", ",", "privateStreamName", ",", "publicStreamName", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'createIngestPoint'", ",", "privateStreamName", "=", "privateStreamName", ",", "publicStreamName", "=", "publicStreamName", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
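An ingest point is a private-to-public name mapping: a pushed stream is accepted only when its target name matches a registered privateStreamName, and it then becomes available under the public name. A hedged sketch of that mapping (not EMS internals):

```python
ingest_points = {}

def create_ingest_point(privateStreamName, publicStreamName):
    """Register an ingest point mapping (illustrative sketch)."""
    ingest_points[privateStreamName] = publicStreamName

def accept_pushed_stream(targetStreamName):
    # A pushed stream's target name must match a privateStreamName;
    # the stream then surfaces under the public name (its localStreamName).
    if targetStreamName not in ingest_points:
        raise PermissionError('no ingest point named %r' % targetStreamName)
    return ingest_points[targetStreamName]

create_ingest_point('secret-ingest-1', 'show')
local_name = accept_pushed_stream('secret-ingest-1')
```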
valid
Api.start_web_rtc
Starts a WebRTC signalling client to an ERS (Evostream Rendezvous Server). :param ersip: IP address (xx.yy.zz.xx) of ERS. :type ersip: str :param ersport: IP port of ERS. :type ersport: int :param roomId: Unique room Identifier within ERS that will be used by client browsers to connect to this EMS. :type roomId: str :link: http://docs.evostream.com/ems_api_definition/startwebrtc
pyems/__init__.py
def start_web_rtc(self, ersip, ersport, roomId): """ Starts a WebRTC signalling client to an ERS (Evostream Rendezvous Server). :param ersip: IP address (xx.yy.zz.xx) of ERS. :type ersip: str :param ersport: IP port of ERS. :type ersport: int :param roomId: Unique room Identifier within ERS that will be used by client browsers to connect to this EMS. :type roomId: str :link: http://docs.evostream.com/ems_api_definition/startwebrtc """ return self.protocol.execute('startwebrtc', ersip=ersip, ersport=ersport, roomId=roomId)
def start_web_rtc(self, ersip, ersport, roomId): """ Starts a WebRTC signalling client to an ERS (Evostream Rendezvous Server). :param ersip: IP address (xx.yy.zz.xx) of ERS. :type ersip: str :param ersport: IP port of ERS. :type ersport: int :param roomId: Unique room Identifier within ERS that will be used by client browsers to connect to this EMS. :type roomId: str :link: http://docs.evostream.com/ems_api_definition/startwebrtc """ return self.protocol.execute('startwebrtc', ersip=ersip, ersport=ersport, roomId=roomId)
[ "Starts", "a", "WebRTC", "signalling", "client", "to", "an", "ERS", "(", "Evostream", "Rendezvous", "Server", ")", "." ]
tomi77/pyems
python
https://github.com/tomi77/pyems/blob/8c0748b720d389f19d5226fdcceedc26cd6284ee/pyems/__init__.py#L1038-L1056
[ "def", "start_web_rtc", "(", "self", ",", "ersip", ",", "ersport", ",", "roomId", ")", ":", "return", "self", ".", "protocol", ".", "execute", "(", "'startwebrtc'", ",", "ersip", "=", "ersip", ",", "ersport", "=", "ersport", ",", "roomId", "=", "roomId", ")" ]
8c0748b720d389f19d5226fdcceedc26cd6284ee
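`ersip` is a dotted-quad address and `ersport` an IP port; a small validation helper (an assumption for illustration, not part of pyems) could check both before issuing `start_web_rtc`:

```python
import ipaddress

def check_ers_endpoint(ersip, ersport):
    """Validate an ERS endpoint before calling start_web_rtc (sketch)."""
    ipaddress.IPv4Address(ersip)      # raises ValueError if not xx.yy.zz.xx
    if not 0 < int(ersport) < 65536:  # valid IP port range
        raise ValueError('port out of range: %r' % ersport)
    return True

ok = check_ers_endpoint('203.0.113.10', 8080)
```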
valid
instantiate
Instantiate the generator and filename specification
dgitcore/datasets/transformation.py
def instantiate(repo, name=None, filename=None): """ Instantiate the generator and filename specification """ default_transformers = repo.options.get('transformer', {}) # If a name is specified, then look up the options from dgit.json # if specified. Otherwise it is initialized to an empty list of # files. transformers = {} if name is not None: # Handle the case generator is specified.. if name in default_transformers: transformers = { name : default_transformers[name] } else: transformers = { name : { 'files': [], } } else: transformers = default_transformers #========================================= # Map the filename patterns to list of files #========================================= # Instantiate the files from the patterns specified input_matching_files = None if filename is not None: input_matching_files = repo.find_matching_files([filename]) for t in transformers: for k in transformers[t]: if "files" not in k: continue if k == "files" and input_matching_files is not None: # Use the files specified on the command line.. transformers[t][k] = input_matching_files else: # Try to match the specification if transformers[t][k] is None or len(transformers[t][k]) == 0: transformers[t][k] = [] else: matching_files = repo.find_matching_files(transformers[t][k]) transformers[t][k] = matching_files return transformers
def instantiate(repo, name=None, filename=None): """ Instantiate the generator and filename specification """ default_transformers = repo.options.get('transformer', {}) # If a name is specified, then look up the options from dgit.json # if specified. Otherwise it is initialized to an empty list of # files. transformers = {} if name is not None: # Handle the case generator is specified.. if name in default_transformers: transformers = { name : default_transformers[name] } else: transformers = { name : { 'files': [], } } else: transformers = default_transformers #========================================= # Map the filename patterns to list of files #========================================= # Instantiate the files from the patterns specified input_matching_files = None if filename is not None: input_matching_files = repo.find_matching_files([filename]) for t in transformers: for k in transformers[t]: if "files" not in k: continue if k == "files" and input_matching_files is not None: # Use the files specified on the command line.. transformers[t][k] = input_matching_files else: # Try to match the specification if transformers[t][k] is None or len(transformers[t][k]) == 0: transformers[t][k] = [] else: matching_files = repo.find_matching_files(transformers[t][k]) transformers[t][k] = matching_files return transformers
[ "Instantiate", "the", "generator", "and", "filename", "specification" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/datasets/transformation.py#L16-L65
[ "def", "instantiate", "(", "repo", ",", "name", "=", "None", ",", "filename", "=", "None", ")", ":", "default_transformers", "=", "repo", ".", "options", ".", "get", "(", "'transformer'", ",", "{", "}", ")", "# If a name is specified, then lookup the options from dgit.json", "# if specfied. Otherwise it is initialized to an empty list of", "# files.", "transformers", "=", "{", "}", "if", "name", "is", "not", "None", ":", "# Handle the case generator is specified..", "if", "name", "in", "default_transformers", ":", "transformers", "=", "{", "name", ":", "default_transformers", "[", "name", "]", "}", "else", ":", "transformers", "=", "{", "name", ":", "{", "'files'", ":", "[", "]", ",", "}", "}", "else", ":", "transformers", "=", "default_transformers", "#=========================================", "# Map the filename patterns to list of files", "#=========================================", "# Instantiate the files from the patterns specified", "input_matching_files", "=", "None", "if", "filename", "is", "not", "None", ":", "input_matching_files", "=", "repo", ".", "find_matching_files", "(", "[", "filename", "]", ")", "for", "t", "in", "transformers", ":", "for", "k", "in", "transformers", "[", "t", "]", ":", "if", "\"files\"", "not", "in", "k", ":", "continue", "if", "k", "==", "\"files\"", "and", "input_matching_files", "is", "not", "None", ":", "# Use the files specified on the command line..", "transformers", "[", "t", "]", "[", "k", "]", "=", "input_matching_files", "else", ":", "# Try to match the specification", "if", "transformers", "[", "t", "]", "[", "k", "]", "is", "None", "or", "len", "(", "transformers", "[", "t", "]", "[", "k", "]", ")", "==", "0", ":", "transformers", "[", "t", "]", "[", "k", "]", "=", "[", "]", "else", ":", "matching_files", "=", "repo", ".", "find_matching_files", "(", "transformers", "[", "t", "]", "[", "k", "]", ")", "transformers", "[", "t", "]", "[", "k", "]", "=", "matching_files", "return", "transformers" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
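The core of `instantiate` is expanding each transformer's `files` patterns into concrete paths via `repo.find_matching_files`. The `FakeRepo` below is a hypothetical stand-in used only to exercise that step; `fnmatch` is one plausible matching strategy, not necessarily what dgit uses internally.

```python
import fnmatch

class FakeRepo:
    """Hypothetical stand-in for the dgit repository object."""
    def __init__(self, files):
        self._files = files
    def find_matching_files(self, patterns):
        # Return files matching any of the shell-style patterns.
        return [f for f in self._files
                if any(fnmatch.fnmatch(f, p) for p in patterns)]

repo = FakeRepo(['queries/top10.sql', 'data/raw.csv', 'README.md'])
transformers = {'sql-runner': {'files': ['queries/*.sql']}}

# Mirror the pattern-to-file expansion performed inside instantiate().
for t in transformers:
    for k in transformers[t]:
        if k == 'files' and transformers[t][k]:
            transformers[t][k] = repo.find_matching_files(transformers[t][k])
```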
valid
transform
Materialize queries/other content within the repo. Parameters ---------- repo: Repository object name: Name of transformer, if any. If none, then all transformers specified in dgit.json will be included. filename: Pattern that specifies files that must be processed by the generators selected. If none, then the default specification in dgit.json is used.
dgitcore/datasets/transformation.py
def transform(repo, name=None, filename=None, force=False, args=[]): """ Materialize queries/other content within the repo. Parameters ---------- repo: Repository object name: Name of transformer, if any. If none, then all transformers specified in dgit.json will be included. filename: Pattern that specifies files that must be processed by the generators selected. If none, then the default specification in dgit.json is used. """ mgr = plugins_get_mgr() # Expand the specification. Now we have full file paths specs = instantiate(repo, name, filename) # Run the validators with rules files... allresults = [] for s in specs: keys = mgr.search(what='transformer',name=s)['transformer'] for k in keys: t = mgr.get_by_key('transformer', k) result = t.evaluate(repo, specs[s], force, args) allresults.extend(result) return allresults
def transform(repo, name=None, filename=None, force=False, args=[]): """ Materialize queries/other content within the repo. Parameters ---------- repo: Repository object name: Name of transformer, if any. If none, then all transformers specified in dgit.json will be included. filename: Pattern that specifies files that must be processed by the generators selected. If none, then the default specification in dgit.json is used. """ mgr = plugins_get_mgr() # Expand the specification. Now we have full file paths specs = instantiate(repo, name, filename) # Run the validators with rules files... allresults = [] for s in specs: keys = mgr.search(what='transformer',name=s)['transformer'] for k in keys: t = mgr.get_by_key('transformer', k) result = t.evaluate(repo, specs[s], force, args) allresults.extend(result) return allresults
[ "Materialize", "queries", "/", "other", "content", "within", "the", "repo", "." ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/datasets/transformation.py#L67-L100
[ "def", "transform", "(", "repo", ",", "name", "=", "None", ",", "filename", "=", "None", ",", "force", "=", "False", ",", "args", "=", "[", "]", ")", ":", "mgr", "=", "plugins_get_mgr", "(", ")", "# Expand the specification. Now we have full file paths", "specs", "=", "instantiate", "(", "repo", ",", "name", ",", "filename", ")", "# Run the validators with rules files...", "allresults", "=", "[", "]", "for", "s", "in", "specs", ":", "keys", "=", "mgr", ".", "search", "(", "what", "=", "'transformer'", ",", "name", "=", "s", ")", "[", "'transformer'", "]", "for", "k", "in", "keys", ":", "t", "=", "mgr", ".", "get_by_key", "(", "'transformer'", ",", "k", ")", "result", "=", "t", ".", "evaluate", "(", "repo", ",", "specs", "[", "s", "]", ",", "force", ",", "args", ")", "allresults", ".", "extend", "(", "result", ")", "return", "allresults" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
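`transform` fans each spec out to the matching transformer plugins and concatenates their results. The stub plugin-manager pieces below are assumptions that mirror only the call shape (`search`, `get_by_key`, `evaluate`), not the real dgit plugin machinery:

```python
class EchoTransformer:
    """Hypothetical plugin: reports what it was asked to evaluate."""
    def evaluate(self, repo, spec, force, args):
        return [{'spec': spec, 'status': 'OK'}]

class FakeManager:
    """Stand-in for plugins_get_mgr(): knows one transformer, 'echo'."""
    def search(self, what, name):
        return {what: [name] if name == 'echo' else []}
    def get_by_key(self, what, key):
        return EchoTransformer()

mgr = FakeManager()
specs = {'echo': {'files': ['a.sql']}}

# Mirror the dispatch loop in transform(): search, fetch, evaluate, extend.
allresults = []
for s in specs:
    for key in mgr.search(what='transformer', name=s)['transformer']:
        t = mgr.get_by_key('transformer', key)
        allresults.extend(t.evaluate(None, specs[s], False, []))
```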
valid
GitRepoManager._run
Helper function to run commands Parameters ---------- cmd : list Arguments to git command
dgitcore/contrib/repomanagers/gitmanager.py
def _run(self, cmd): """ Helper function to run commands Parameters ---------- cmd : list Arguments to git command """ # This is here in case the .gitconfig is not accessible for # some reason. environ = os.environ.copy() environ['GIT_COMMITTER_NAME'] = self.fullname environ['GIT_COMMITTER_EMAIL'] = self.email environ['GIT_AUTHOR_NAME'] = self.fullname environ['GIT_AUTHOR_EMAIL'] = self.email cmd = [pipes.quote(c) for c in cmd] cmd = " ".join(['/usr/bin/git'] + cmd) cmd += "; exit 0" #print("Running cmd", cmd) try: output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, shell=True, env=environ) except subprocess.CalledProcessError as e: output = e.output output = output.decode('utf-8') output = output.strip() # print("Output of command", output) return output
def _run(self, cmd): """ Helper function to run commands Parameters ---------- cmd : list Arguments to git command """ # This is here in case the .gitconfig is not accessible for # some reason. environ = os.environ.copy() environ['GIT_COMMITTER_NAME'] = self.fullname environ['GIT_COMMITTER_EMAIL'] = self.email environ['GIT_AUTHOR_NAME'] = self.fullname environ['GIT_AUTHOR_EMAIL'] = self.email cmd = [pipes.quote(c) for c in cmd] cmd = " ".join(['/usr/bin/git'] + cmd) cmd += "; exit 0" #print("Running cmd", cmd) try: output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, shell=True, env=environ) except subprocess.CalledProcessError as e: output = e.output output = output.decode('utf-8') output = output.strip() # print("Output of command", output) return output
[ "Helper", "function", "to", "run", "commands" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L37-L71
[ "def", "_run", "(", "self", ",", "cmd", ")", ":", "# This is here in case the .gitconfig is not accessible for", "# some reason. ", "environ", "=", "os", ".", "environ", ".", "copy", "(", ")", "environ", "[", "'GIT_COMMITTER_NAME'", "]", "=", "self", ".", "fullname", "environ", "[", "'GIT_COMMITTER_EMAIL'", "]", "=", "self", ".", "email", "environ", "[", "'GIT_AUTHOR_NAME'", "]", "=", "self", ".", "fullname", "environ", "[", "'GIT_AUTHOR_EMAIL'", "]", "=", "self", ".", "email", "cmd", "=", "[", "pipes", ".", "quote", "(", "c", ")", "for", "c", "in", "cmd", "]", "cmd", "=", "\" \"", ".", "join", "(", "[", "'/usr/bin/git'", "]", "+", "cmd", ")", "cmd", "+=", "\"; exit 0\"", "#print(\"Running cmd\", cmd)", "try", ":", "output", "=", "subprocess", ".", "check_output", "(", "cmd", ",", "stderr", "=", "subprocess", ".", "STDOUT", ",", "shell", "=", "True", ",", "env", "=", "environ", ")", "except", "subprocess", ".", "CalledProcessError", "as", "e", ":", "output", "=", "e", ".", "output", "output", "=", "output", ".", "decode", "(", "'utf-8'", ")", "output", "=", "output", ".", "strip", "(", ")", "# print(\"Output of command\", output)", "return", "output" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager._run_generic_command
Run a generic command within the repo. Assumes that you are in the repo's root directory
dgitcore/contrib/repomanagers/gitmanager.py
def _run_generic_command(self, repo, cmd): """ Run a generic command within the repo. Assumes that you are in the repo's root directory """ result = None with cd(repo.rootdir): # Dont use sh. It is not collecting the stdout of all # child processes. output = self._run(cmd) try: result = { 'cmd': cmd, 'status': 'success', 'message': output, } except Exception as e: result = { 'cmd': cmd, 'status': 'error', 'message': str(e) } return result
def _run_generic_command(self, repo, cmd): """ Run a generic command within the repo. Assumes that you are in the repo's root directory """ result = None with cd(repo.rootdir): # Dont use sh. It is not collecting the stdout of all # child processes. output = self._run(cmd) try: result = { 'cmd': cmd, 'status': 'success', 'message': output, } except Exception as e: result = { 'cmd': cmd, 'status': 'error', 'message': str(e) } return result
[ "Run", "a", "generic", "command", "within", "the", "repo", ".", "Assumes", "that", "you", "are", "in", "the", "repo", "s", "root", "directory" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L73-L97
[ "def", "_run_generic_command", "(", "self", ",", "repo", ",", "cmd", ")", ":", "result", "=", "None", "with", "cd", "(", "repo", ".", "rootdir", ")", ":", "# Dont use sh. It is not collecting the stdout of all", "# child processes.", "output", "=", "self", ".", "_run", "(", "cmd", ")", "try", ":", "result", "=", "{", "'cmd'", ":", "cmd", ",", "'status'", ":", "'success'", ",", "'message'", ":", "output", ",", "}", "except", "Exception", "as", "e", ":", "result", "=", "{", "'cmd'", ":", "cmd", ",", "'status'", ":", "'error'", ",", "'message'", ":", "str", "(", "e", ")", "}", "return", "result" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.init
Initialize a Git repo Parameters ---------- username, reponame : Repo name is tuple (name, reponame) force: force initialization of the repo even if exists backend: backend that must be used for this (e.g. s3)
dgitcore/contrib/repomanagers/gitmanager.py
def init(self, username, reponame, force, backend=None): """ Initialize a Git repo Parameters ---------- username, reponame : Repo name is tuple (name, reponame) force: force initialization of the repo even if exists backend: backend that must be used for this (e.g. s3) """ key = self.key(username, reponame) # In local filesystem-based server, add a repo server_repodir = self.server_rootdir(username, reponame, create=False) # Force cleanup if needed if os.path.exists(server_repodir) and not force: raise RepositoryExists() if os.path.exists(server_repodir): shutil.rmtree(server_repodir) os.makedirs(server_repodir) # Initialize the repo with cd(server_repodir): git.init(".", "--bare") if backend is not None: backend.init_repo(server_repodir) # Now clone the filesystem-based repo repodir = self.rootdir(username, reponame, create=False) # Prepare it if needed if os.path.exists(repodir) and not force: raise Exception("Local repo already exists") if os.path.exists(repodir): shutil.rmtree(repodir) os.makedirs(repodir) # Now clone... with cd(os.path.dirname(repodir)): git.clone(server_repodir, '--no-hardlinks') url = server_repodir if backend is not None: url = backend.url(username, reponame) repo = Repo(username, reponame) repo.manager = self repo.remoteurl = url repo.rootdir = self.rootdir(username, reponame) self.add(repo) return repo
def init(self, username, reponame, force, backend=None): """ Initialize a Git repo Parameters ---------- username, reponame : Repo name is tuple (name, reponame) force: force initialization of the repo even if exists backend: backend that must be used for this (e.g. s3) """ key = self.key(username, reponame) # In local filesystem-based server, add a repo server_repodir = self.server_rootdir(username, reponame, create=False) # Force cleanup if needed if os.path.exists(server_repodir) and not force: raise RepositoryExists() if os.path.exists(server_repodir): shutil.rmtree(server_repodir) os.makedirs(server_repodir) # Initialize the repo with cd(server_repodir): git.init(".", "--bare") if backend is not None: backend.init_repo(server_repodir) # Now clone the filesystem-based repo repodir = self.rootdir(username, reponame, create=False) # Prepare it if needed if os.path.exists(repodir) and not force: raise Exception("Local repo already exists") if os.path.exists(repodir): shutil.rmtree(repodir) os.makedirs(repodir) # Now clone... with cd(os.path.dirname(repodir)): git.clone(server_repodir, '--no-hardlinks') url = server_repodir if backend is not None: url = backend.url(username, reponame) repo = Repo(username, reponame) repo.manager = self repo.remoteurl = url repo.rootdir = self.rootdir(username, reponame) self.add(repo) return repo
[ "Initialize", "a", "Git", "repo" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L230-L287
[ "def", "init", "(", "self", ",", "username", ",", "reponame", ",", "force", ",", "backend", "=", "None", ")", ":", "key", "=", "self", ".", "key", "(", "username", ",", "reponame", ")", "# In local filesystem-based server, add a repo", "server_repodir", "=", "self", ".", "server_rootdir", "(", "username", ",", "reponame", ",", "create", "=", "False", ")", "# Force cleanup if needed", "if", "os", ".", "path", ".", "exists", "(", "server_repodir", ")", "and", "not", "force", ":", "raise", "RepositoryExists", "(", ")", "if", "os", ".", "path", ".", "exists", "(", "server_repodir", ")", ":", "shutil", ".", "rmtree", "(", "server_repodir", ")", "os", ".", "makedirs", "(", "server_repodir", ")", "# Initialize the repo", "with", "cd", "(", "server_repodir", ")", ":", "git", ".", "init", "(", "\".\"", ",", "\"--bare\"", ")", "if", "backend", "is", "not", "None", ":", "backend", ".", "init_repo", "(", "server_repodir", ")", "# Now clone the filesystem-based repo", "repodir", "=", "self", ".", "rootdir", "(", "username", ",", "reponame", ",", "create", "=", "False", ")", "# Prepare it if needed", "if", "os", ".", "path", ".", "exists", "(", "repodir", ")", "and", "not", "force", ":", "raise", "Exception", "(", "\"Local repo already exists\"", ")", "if", "os", ".", "path", ".", "exists", "(", "repodir", ")", ":", "shutil", ".", "rmtree", "(", "repodir", ")", "os", ".", "makedirs", "(", "repodir", ")", "# Now clone...", "with", "cd", "(", "os", ".", "path", ".", "dirname", "(", "repodir", ")", ")", ":", "git", ".", "clone", "(", "server_repodir", ",", "'--no-hardlinks'", ")", "url", "=", "server_repodir", "if", "backend", "is", "not", "None", ":", "url", "=", "backend", ".", "url", "(", "username", ",", "reponame", ")", "repo", "=", "Repo", "(", "username", ",", "reponame", ")", "repo", ".", "manager", "=", "self", "repo", ".", "remoteurl", "=", "url", "repo", ".", "rootdir", "=", "self", ".", "rootdir", "(", "username", ",", "reponame", ")", "self", ".", "add", "(", "repo", 
")", "return", "repo" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.clone
Clone a URL Parameters ---------- url : URL of the repo. Supports s3://, git@, http://
dgitcore/contrib/repomanagers/gitmanager.py
def clone(self, url, backend=None): """ Clone a URL Parameters ---------- url : URL of the repo. Supports s3://, git@, http:// """ # s3://bucket/git/username/repo.git username = self.username reponame = url.split("/")[-1] # with git reponame = reponame.replace(".git","") key = (username, reponame) # In local filesystem-based server, add a repo server_repodir = self.server_rootdir(username, reponame, create=False) rootdir = self.rootdir(username, reponame, create=False) if backend is None: # Backend is standard git repo (https://, git@...) with cd(os.path.dirname(rootdir)): self._run(['clone', '--no-hardlinks', url]) else: # Backend is s3 # Sync if needed. if not os.path.exists(server_repodir): # s3 -> .dgit/git/pingali/hello.git -> .dgit/datasets/pingali/hello backend.clone_repo(url, server_repodir) # After sync clone, with cd(os.path.dirname(rootdir)): self._run(['clone', '--no-hardlinks', server_repodir]) # Insert the notes push if True: configfile = os.path.join(rootdir, '.git', 'config') content = open(configfile).read() original = "fetch = +refs/heads/*:refs/remotes/origin/*" replacement ="""fetch = +refs/heads/*:refs/remotes/origin/*\n fetch = +refs/notes/*:refs/notes/*""" if "notes" not in content: content = content.replace(original, replacement) with open(configfile, 'w') as fd: fd.write(content) # Pull the notes if any as well.. with cd(rootdir): self._run(['pull','origin']) # Insert the object into the internal table we maintain... r = Repo(username, reponame) r.rootdir = rootdir r.remoteurl = url r.manager = self package = os.path.join(r.rootdir, 'datapackage.json') packagedata = open(package).read() r.package = json.JSONDecoder(object_pairs_hook=collections.OrderedDict).decode(packagedata) return self.add(r)
def clone(self, url, backend=None): """ Clone a URL Parameters ---------- url : URL of the repo. Supports s3://, git@, http:// """ # s3://bucket/git/username/repo.git username = self.username reponame = url.split("/")[-1] # with git reponame = reponame.replace(".git","") key = (username, reponame) # In local filesystem-based server, add a repo server_repodir = self.server_rootdir(username, reponame, create=False) rootdir = self.rootdir(username, reponame, create=False) if backend is None: # Backend is standard git repo (https://, git@...) with cd(os.path.dirname(rootdir)): self._run(['clone', '--no-hardlinks', url]) else: # Backend is s3 # Sync if needed. if not os.path.exists(server_repodir): # s3 -> .dgit/git/pingali/hello.git -> .dgit/datasets/pingali/hello backend.clone_repo(url, server_repodir) # After sync clone, with cd(os.path.dirname(rootdir)): self._run(['clone', '--no-hardlinks', server_repodir]) # Insert the notes push if True: configfile = os.path.join(rootdir, '.git', 'config') content = open(configfile).read() original = "fetch = +refs/heads/*:refs/remotes/origin/*" replacement ="""fetch = +refs/heads/*:refs/remotes/origin/*\n fetch = +refs/notes/*:refs/notes/*""" if "notes" not in content: content = content.replace(original, replacement) with open(configfile, 'w') as fd: fd.write(content) # Pull the notes if any as well.. with cd(rootdir): self._run(['pull','origin']) # Insert the object into the internal table we maintain... r = Repo(username, reponame) r.rootdir = rootdir r.remoteurl = url r.manager = self package = os.path.join(r.rootdir, 'datapackage.json') packagedata = open(package).read() r.package = json.JSONDecoder(object_pairs_hook=collections.OrderedDict).decode(packagedata) return self.add(r)
[ "Clone", "a", "URL" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L289-L356
[ "def", "clone", "(", "self", ",", "url", ",", "backend", "=", "None", ")", ":", "# s3://bucket/git/username/repo.git", "username", "=", "self", ".", "username", "reponame", "=", "url", ".", "split", "(", "\"/\"", ")", "[", "-", "1", "]", "# with git", "reponame", "=", "reponame", ".", "replace", "(", "\".git\"", ",", "\"\"", ")", "key", "=", "(", "username", ",", "reponame", ")", "# In local filesystem-based server, add a repo", "server_repodir", "=", "self", ".", "server_rootdir", "(", "username", ",", "reponame", ",", "create", "=", "False", ")", "rootdir", "=", "self", ".", "rootdir", "(", "username", ",", "reponame", ",", "create", "=", "False", ")", "if", "backend", "is", "None", ":", "# Backend is standard git repo (https://, git@...)", "with", "cd", "(", "os", ".", "path", ".", "dirname", "(", "rootdir", ")", ")", ":", "self", ".", "_run", "(", "[", "'clone'", ",", "'--no-hardlinks'", ",", "url", "]", ")", "else", ":", "# Backend is s3", "# Sync if needed.", "if", "not", "os", ".", "path", ".", "exists", "(", "server_repodir", ")", ":", "# s3 -> .dgit/git/pingali/hello.git -> .dgit/datasets/pingali/hello", "backend", ".", "clone_repo", "(", "url", ",", "server_repodir", ")", "# After sync clone,", "with", "cd", "(", "os", ".", "path", ".", "dirname", "(", "rootdir", ")", ")", ":", "self", ".", "_run", "(", "[", "'clone'", ",", "'--no-hardlinks'", ",", "server_repodir", "]", ")", "# Insert the notes push", "if", "True", ":", "configfile", "=", "os", ".", "path", ".", "join", "(", "rootdir", ",", "'.git'", ",", "'config'", ")", "content", "=", "open", "(", "configfile", ")", ".", "read", "(", ")", "original", "=", "\"fetch = +refs/heads/*:refs/remotes/origin/*\"", "replacement", "=", "\"\"\"fetch = +refs/heads/*:refs/remotes/origin/*\\n fetch = +refs/notes/*:refs/notes/*\"\"\"", "if", "\"notes\"", "not", "in", "content", ":", "content", "=", "content", ".", "replace", "(", "original", ",", "replacement", ")", "with", "open", "(", "configfile", ",", "'w'", 
")", "as", "fd", ":", "fd", ".", "write", "(", "content", ")", "# Pull the notes if any as well..", "with", "cd", "(", "rootdir", ")", ":", "self", ".", "_run", "(", "[", "'pull'", ",", "'origin'", "]", ")", "# Insert the object into the internal table we maintain...", "r", "=", "Repo", "(", "username", ",", "reponame", ")", "r", ".", "rootdir", "=", "rootdir", "r", ".", "remoteurl", "=", "url", "r", ".", "manager", "=", "self", "package", "=", "os", ".", "path", ".", "join", "(", "r", ".", "rootdir", ",", "'datapackage.json'", ")", "packagedata", "=", "open", "(", "package", ")", ".", "read", "(", ")", "r", ".", "package", "=", "json", ".", "JSONDecoder", "(", "object_pairs_hook", "=", "collections", ".", "OrderedDict", ")", ".", "decode", "(", "packagedata", ")", "return", "self", ".", "add", "(", "r", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.delete
Delete files from the repo
dgitcore/contrib/repomanagers/gitmanager.py
def delete(self, repo, args=[]): """ Delete files from the repo """ result = None with cd(repo.rootdir): try: cmd = ['rm'] + list(args) result = { 'status': 'success', 'message': self._run(cmd) } except Exception as e: result = { 'status': 'error', 'message': str(e) } # print(result) return result
def delete(self, repo, args=[]): """ Delete files from the repo """ result = None with cd(repo.rootdir): try: cmd = ['rm'] + list(args) result = { 'status': 'success', 'message': self._run(cmd) } except Exception as e: result = { 'status': 'error', 'message': str(e) } # print(result) return result
[ "Delete", "files", "from", "the", "repo" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L359-L379
[ "def", "delete", "(", "self", ",", "repo", ",", "args", "=", "[", "]", ")", ":", "result", "=", "None", "with", "cd", "(", "repo", ".", "rootdir", ")", ":", "try", ":", "cmd", "=", "[", "'rm'", "]", "+", "list", "(", "args", ")", "result", "=", "{", "'status'", ":", "'success'", ",", "'message'", ":", "self", ".", "_run", "(", "cmd", ")", "}", "except", "Exception", "as", "e", ":", "result", "=", "{", "'status'", ":", "'error'", ",", "'message'", ":", "str", "(", "e", ")", "}", "# print(result)", "return", "result" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.drop
Cleanup the repo
dgitcore/contrib/repomanagers/gitmanager.py
def drop(self, repo, args=[]): """ Cleanup the repo """ # Clean up the rootdir rootdir = repo.rootdir if os.path.exists(rootdir): print("Cleaning repo directory: {}".format(rootdir)) shutil.rmtree(rootdir) # Cleanup the local version of the repo (this could be on # the server etc. server_repodir = self.server_rootdir_from_repo(repo, create=False) if os.path.exists(server_repodir): print("Cleaning data from local git 'server': {}".format(server_repodir)) shutil.rmtree(server_repodir) super(GitRepoManager, self).drop(repo) return { 'status': 'success', 'message': "successful cleanup" }
def drop(self, repo, args=[]): """ Cleanup the repo """ # Clean up the rootdir rootdir = repo.rootdir if os.path.exists(rootdir): print("Cleaning repo directory: {}".format(rootdir)) shutil.rmtree(rootdir) # Cleanup the local version of the repo (this could be on # the server etc. server_repodir = self.server_rootdir_from_repo(repo, create=False) if os.path.exists(server_repodir): print("Cleaning data from local git 'server': {}".format(server_repodir)) shutil.rmtree(server_repodir) super(GitRepoManager, self).drop(repo) return { 'status': 'success', 'message': "successful cleanup" }
[ "Cleanup", "the", "repo" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L381-L405
[ "def", "drop", "(", "self", ",", "repo", ",", "args", "=", "[", "]", ")", ":", "# Clean up the rootdir", "rootdir", "=", "repo", ".", "rootdir", "if", "os", ".", "path", ".", "exists", "(", "rootdir", ")", ":", "print", "(", "\"Cleaning repo directory: {}\"", ".", "format", "(", "rootdir", ")", ")", "shutil", ".", "rmtree", "(", "rootdir", ")", "# Cleanup the local version of the repo (this could be on", "# the server etc.", "server_repodir", "=", "self", ".", "server_rootdir_from_repo", "(", "repo", ",", "create", "=", "False", ")", "if", "os", ".", "path", ".", "exists", "(", "server_repodir", ")", ":", "print", "(", "\"Cleaning data from local git 'server': {}\"", ".", "format", "(", "server_repodir", ")", ")", "shutil", ".", "rmtree", "(", "server_repodir", ")", "super", "(", "GitRepoManager", ",", "self", ")", ".", "drop", "(", "repo", ")", "return", "{", "'status'", ":", "'success'", ",", "'message'", ":", "\"successful cleanup\"", "}" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.permalink
Get the permalink to command that generated the dataset
dgitcore/contrib/repomanagers/gitmanager.py
def permalink(self, repo, path): """ Get the permalink to command that generated the dataset """ if not os.path.exists(path): # print("Path does not exist", path) return (None, None) # Get this directory cwd = os.getcwd() # Find the root of the repo and cd into that directory.. if os.path.isfile(path): os.chdir(os.path.dirname(path)) rootdir = self._run(["rev-parse", "--show-toplevel"]) if "fatal" in rootdir: # print("fatal", rootdir) return (None, None) os.chdir(rootdir) # print("Rootdir = ", rootdir) # Now find relative path relpath = os.path.relpath(path, rootdir) # print("relpath = ", relpath) # Get the last commit for this file #3764cc2600b221ac7d7497de3d0dbcb4cffa2914 sha1 = self._run(["log", "-n", "1", "--format=format:%H", relpath]) # print("sha1 = ", sha1) # Get the repo URL #git@gitlab.com:pingali/simple-regression.git #https://gitlab.com/kanban_demo/test_project.git remoteurl = self._run(["config", "--get", "remote.origin.url"]) # print("remoteurl = ", remoteurl) # Go back to the original directory... os.chdir(cwd) # Now match it against two possible formats of the remote url # Examples #https://help.github.com/articles/getting-permanent-links-to-files/ #https://github.com/github/hubot/blob/ed25584f5ac2520a6c28547ffd0961c7abd7ea49/README.md #https://gitlab.com/pingali/simple-regression/blob/3764cc2600b221ac7d7497de3d0dbcb4cffa2914/model.py #https://github.com/pingali/dgit/blob/ff91b5d04b2978cad0bf9b006d1b0a16d18a778e/README.rst #https://gitlab.com/kanban_demo/test_project/blob/b004677c23b3a31eb7b5588a5194857b2c8b2b95/README.md m = re.search('^git@([^:\/]+):([^/]+)/([^/]+)', remoteurl) if m is None: m = re.search('^https://([^:/]+)/([^/]+)/([^/]+)', remoteurl) if m is not None: domain = m.group(1) username = m.group(2) project = m.group(3) if project.endswith(".git"): project = project[:-4] permalink = "https://{}/{}/{}/blob/{}/{}".format(domain, username, project, sha1, relpath) # print("permalink = ", permalink) return (relpath, permalink) else: return (None, None)
def permalink(self, repo, path): """ Get the permalink to command that generated the dataset """ if not os.path.exists(path): # print("Path does not exist", path) return (None, None) # Get this directory cwd = os.getcwd() # Find the root of the repo and cd into that directory.. if os.path.isfile(path): os.chdir(os.path.dirname(path)) rootdir = self._run(["rev-parse", "--show-toplevel"]) if "fatal" in rootdir: # print("fatal", rootdir) return (None, None) os.chdir(rootdir) # print("Rootdir = ", rootdir) # Now find relative path relpath = os.path.relpath(path, rootdir) # print("relpath = ", relpath) # Get the last commit for this file #3764cc2600b221ac7d7497de3d0dbcb4cffa2914 sha1 = self._run(["log", "-n", "1", "--format=format:%H", relpath]) # print("sha1 = ", sha1) # Get the repo URL #git@gitlab.com:pingali/simple-regression.git #https://gitlab.com/kanban_demo/test_project.git remoteurl = self._run(["config", "--get", "remote.origin.url"]) # print("remoteurl = ", remoteurl) # Go back to the original directory... os.chdir(cwd) # Now match it against two possible formats of the remote url # Examples #https://help.github.com/articles/getting-permanent-links-to-files/ #https://github.com/github/hubot/blob/ed25584f5ac2520a6c28547ffd0961c7abd7ea49/README.md #https://gitlab.com/pingali/simple-regression/blob/3764cc2600b221ac7d7497de3d0dbcb4cffa2914/model.py #https://github.com/pingali/dgit/blob/ff91b5d04b2978cad0bf9b006d1b0a16d18a778e/README.rst #https://gitlab.com/kanban_demo/test_project/blob/b004677c23b3a31eb7b5588a5194857b2c8b2b95/README.md m = re.search('^git@([^:\/]+):([^/]+)/([^/]+)', remoteurl) if m is None: m = re.search('^https://([^:/]+)/([^/]+)/([^/]+)', remoteurl) if m is not None: domain = m.group(1) username = m.group(2) project = m.group(3) if project.endswith(".git"): project = project[:-4] permalink = "https://{}/{}/{}/blob/{}/{}".format(domain, username, project, sha1, relpath) # print("permalink = ", permalink) return (relpath, permalink) else: return (None, None)
[ "Get", "the", "permalink", "to", "command", "that", "generated", "the", "dataset" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L407-L471
[ "def", "permalink", "(", "self", ",", "repo", ",", "path", ")", ":", "if", "not", "os", ".", "path", ".", "exists", "(", "path", ")", ":", "# print(\"Path does not exist\", path)", "return", "(", "None", ",", "None", ")", "# Get this directory", "cwd", "=", "os", ".", "getcwd", "(", ")", "# Find the root of the repo and cd into that directory..", "if", "os", ".", "path", ".", "isfile", "(", "path", ")", ":", "os", ".", "chdir", "(", "os", ".", "path", ".", "dirname", "(", "path", ")", ")", "rootdir", "=", "self", ".", "_run", "(", "[", "\"rev-parse\"", ",", "\"--show-toplevel\"", "]", ")", "if", "\"fatal\"", "in", "rootdir", ":", "# print(\"fatal\", rootdir)", "return", "(", "None", ",", "None", ")", "os", ".", "chdir", "(", "rootdir", ")", "# print(\"Rootdir = \", rootdir)", "# Now find relative path", "relpath", "=", "os", ".", "path", ".", "relpath", "(", "path", ",", "rootdir", ")", "# print(\"relpath = \", relpath)", "# Get the last commit for this file", "#3764cc2600b221ac7d7497de3d0dbcb4cffa2914", "sha1", "=", "self", ".", "_run", "(", "[", "\"log\"", ",", "\"-n\"", ",", "\"1\"", ",", "\"--format=format:%H\"", ",", "relpath", "]", ")", "# print(\"sha1 = \", sha1)", "# Get the repo URL", "#git@gitlab.com:pingali/simple-regression.git", "#https://gitlab.com/kanban_demo/test_project.git", "remoteurl", "=", "self", ".", "_run", "(", "[", "\"config\"", ",", "\"--get\"", ",", "\"remote.origin.url\"", "]", ")", "# print(\"remoteurl = \", remoteurl)", "# Go back to the original directory...", "os", ".", "chdir", "(", "cwd", ")", "# Now match it against two possible formats of the remote url", "# Examples", "#https://help.github.com/articles/getting-permanent-links-to-files/", "#https://github.com/github/hubot/blob/ed25584f5ac2520a6c28547ffd0961c7abd7ea49/README.md", "#https://gitlab.com/pingali/simple-regression/blob/3764cc2600b221ac7d7497de3d0dbcb4cffa2914/model.py", "#https://github.com/pingali/dgit/blob/ff91b5d04b2978cad0bf9b006d1b0a16d18a778e/README.rst", 
"#https://gitlab.com/kanban_demo/test_project/blob/b004677c23b3a31eb7b5588a5194857b2c8b2b95/README.md", "m", "=", "re", ".", "search", "(", "'^git@([^:\\/]+):([^/]+)/([^/]+)'", ",", "remoteurl", ")", "if", "m", "is", "None", ":", "m", "=", "re", ".", "search", "(", "'^https://([^:/]+)/([^/]+)/([^/]+)'", ",", "remoteurl", ")", "if", "m", "is", "not", "None", ":", "domain", "=", "m", ".", "group", "(", "1", ")", "username", "=", "m", ".", "group", "(", "2", ")", "project", "=", "m", ".", "group", "(", "3", ")", "if", "project", ".", "endswith", "(", "\".git\"", ")", ":", "project", "=", "project", "[", ":", "-", "4", "]", "permalink", "=", "\"https://{}/{}/{}/blob/{}/{}\"", ".", "format", "(", "domain", ",", "username", ",", "project", ",", "sha1", ",", "relpath", ")", "# print(\"permalink = \", permalink)", "return", "(", "relpath", ",", "permalink", ")", "else", ":", "return", "(", "None", ",", "None", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.add_files
Add files to the repo
dgitcore/contrib/repomanagers/gitmanager.py
def add_files(self, repo, files): """ Add files to the repo """ rootdir = repo.rootdir for f in files: relativepath = f['relativepath'] sourcepath = f['localfullpath'] if sourcepath is None: # This can happen if the relative path is a URL continue # # Prepare the target path targetpath = os.path.join(rootdir, relativepath) try: os.makedirs(os.path.dirname(targetpath)) except: pass # print(sourcepath," => ", targetpath) print("Updating: {}".format(relativepath)) shutil.copyfile(sourcepath, targetpath) with cd(repo.rootdir): self._run(['add', relativepath])
def add_files(self, repo, files): """ Add files to the repo """ rootdir = repo.rootdir for f in files: relativepath = f['relativepath'] sourcepath = f['localfullpath'] if sourcepath is None: # This can happen if the relative path is a URL continue # # Prepare the target path targetpath = os.path.join(rootdir, relativepath) try: os.makedirs(os.path.dirname(targetpath)) except: pass # print(sourcepath," => ", targetpath) print("Updating: {}".format(relativepath)) shutil.copyfile(sourcepath, targetpath) with cd(repo.rootdir): self._run(['add', relativepath])
[ "Add", "files", "to", "the", "repo" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L484-L505
[ "def", "add_files", "(", "self", ",", "repo", ",", "files", ")", ":", "rootdir", "=", "repo", ".", "rootdir", "for", "f", "in", "files", ":", "relativepath", "=", "f", "[", "'relativepath'", "]", "sourcepath", "=", "f", "[", "'localfullpath'", "]", "if", "sourcepath", "is", "None", ":", "# This can happen if the relative path is a URL", "continue", "#", "# Prepare the target path", "targetpath", "=", "os", ".", "path", ".", "join", "(", "rootdir", ",", "relativepath", ")", "try", ":", "os", ".", "makedirs", "(", "os", ".", "path", ".", "dirname", "(", "targetpath", ")", ")", "except", ":", "pass", "# print(sourcepath,\" => \", targetpath)", "print", "(", "\"Updating: {}\"", ".", "format", "(", "relativepath", ")", ")", "shutil", ".", "copyfile", "(", "sourcepath", ",", "targetpath", ")", "with", "cd", "(", "repo", ".", "rootdir", ")", ":", "self", ".", "_run", "(", "[", "'add'", ",", "relativepath", "]", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
GitRepoManager.config
Paramers: --------- workspace: Directory to store the dataset repositories email:
dgitcore/contrib/repomanagers/gitmanager.py
def config(self, what='get', params=None): """ Paramers: --------- workspace: Directory to store the dataset repositories email: """ if what == 'get': return { 'name': 'git', 'nature': 'repomanager', 'variables': [], } elif what == 'set': self.workspace = params['Local']['workspace'] self.workspace = os.path.abspath(self.workspace) self.username = params['User']['user.name'] self.fullname = params['User']['user.fullname'] self.email = params['User']['user.email'] repodir = os.path.join(self.workspace, 'datasets') if not os.path.exists(repodir): return for username in os.listdir(repodir): for reponame in os.listdir(os.path.join(repodir, username)): if self.is_my_repo(username, reponame): r = Repo(username, reponame) r.rootdir = os.path.join(repodir, username, reponame) package = os.path.join(r.rootdir, 'datapackage.json') if not os.path.exists(package): print("datapackage.json does not exist in dataset") print("Skipping: {}/{}".format(username, reponame)) continue packagedata = open(package).read() r.package = json.JSONDecoder(object_pairs_hook=collections.OrderedDict).decode(packagedata) r.manager = self self.add(r)
def config(self, what='get', params=None): """ Paramers: --------- workspace: Directory to store the dataset repositories email: """ if what == 'get': return { 'name': 'git', 'nature': 'repomanager', 'variables': [], } elif what == 'set': self.workspace = params['Local']['workspace'] self.workspace = os.path.abspath(self.workspace) self.username = params['User']['user.name'] self.fullname = params['User']['user.fullname'] self.email = params['User']['user.email'] repodir = os.path.join(self.workspace, 'datasets') if not os.path.exists(repodir): return for username in os.listdir(repodir): for reponame in os.listdir(os.path.join(repodir, username)): if self.is_my_repo(username, reponame): r = Repo(username, reponame) r.rootdir = os.path.join(repodir, username, reponame) package = os.path.join(r.rootdir, 'datapackage.json') if not os.path.exists(package): print("datapackage.json does not exist in dataset") print("Skipping: {}/{}".format(username, reponame)) continue packagedata = open(package).read() r.package = json.JSONDecoder(object_pairs_hook=collections.OrderedDict).decode(packagedata) r.manager = self self.add(r)
[ "Paramers", ":", "---------" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/repomanagers/gitmanager.py#L507-L546
[ "def", "config", "(", "self", ",", "what", "=", "'get'", ",", "params", "=", "None", ")", ":", "if", "what", "==", "'get'", ":", "return", "{", "'name'", ":", "'git'", ",", "'nature'", ":", "'repomanager'", ",", "'variables'", ":", "[", "]", ",", "}", "elif", "what", "==", "'set'", ":", "self", ".", "workspace", "=", "params", "[", "'Local'", "]", "[", "'workspace'", "]", "self", ".", "workspace", "=", "os", ".", "path", ".", "abspath", "(", "self", ".", "workspace", ")", "self", ".", "username", "=", "params", "[", "'User'", "]", "[", "'user.name'", "]", "self", ".", "fullname", "=", "params", "[", "'User'", "]", "[", "'user.fullname'", "]", "self", ".", "email", "=", "params", "[", "'User'", "]", "[", "'user.email'", "]", "repodir", "=", "os", ".", "path", ".", "join", "(", "self", ".", "workspace", ",", "'datasets'", ")", "if", "not", "os", ".", "path", ".", "exists", "(", "repodir", ")", ":", "return", "for", "username", "in", "os", ".", "listdir", "(", "repodir", ")", ":", "for", "reponame", "in", "os", ".", "listdir", "(", "os", ".", "path", ".", "join", "(", "repodir", ",", "username", ")", ")", ":", "if", "self", ".", "is_my_repo", "(", "username", ",", "reponame", ")", ":", "r", "=", "Repo", "(", "username", ",", "reponame", ")", "r", ".", "rootdir", "=", "os", ".", "path", ".", "join", "(", "repodir", ",", "username", ",", "reponame", ")", "package", "=", "os", ".", "path", ".", "join", "(", "r", ".", "rootdir", ",", "'datapackage.json'", ")", "if", "not", "os", ".", "path", ".", "exists", "(", "package", ")", ":", "print", "(", "\"datapackage.json does not exist in dataset\"", ")", "print", "(", "\"Skipping: {}/{}\"", ".", "format", "(", "username", ",", "reponame", ")", ")", "continue", "packagedata", "=", "open", "(", "package", ")", ".", "read", "(", ")", "r", ".", "package", "=", "json", ".", "JSONDecoder", "(", "object_pairs_hook", "=", "collections", ".", "OrderedDict", ")", ".", "decode", "(", "packagedata", ")", "r", ".", "manager", "=", 
"self", "self", ".", "add", "(", "r", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
Invoice._init_empty
Creates the base set of attributes invoice has/needs
holviapi/invoicing.py
def _init_empty(self):
    """Creates the base set of attributes invoice has/needs"""
    self._jsondata = {
        "code": None,
        "currency": "EUR",
        "subject": "",
        "due_date": (datetime.datetime.now().date()
                     + datetime.timedelta(days=14)).isoformat(),
        "issue_date": datetime.datetime.now().date().isoformat(),
        "number": None,
        "type": "outbound",
        "receiver": {
            "name": "",
            "email": "",
            "street": "",
            "city": "",
            "postcode": "",
            "country": ""
        },
        "items": [],
    }
[ "Creates", "the", "base", "set", "of", "attributes", "invoice", "has", "/", "needs" ]
rambo/python-holviapi
python
https://github.com/rambo/python-holviapi/blob/f57f44e7b0a1030786aafd6f387114abb546bb32/holviapi/invoicing.py#L33-L52
[ "def", "_init_empty", "(", "self", ")", ":", "self", ".", "_jsondata", "=", "{", "\"code\"", ":", "None", ",", "\"currency\"", ":", "\"EUR\"", ",", "\"subject\"", ":", "\"\"", ",", "\"due_date\"", ":", "(", "datetime", ".", "datetime", ".", "now", "(", ")", ".", "date", "(", ")", "+", "datetime", ".", "timedelta", "(", "days", "=", "14", ")", ")", ".", "isoformat", "(", ")", ",", "\"issue_date\"", ":", "datetime", ".", "datetime", ".", "now", "(", ")", ".", "date", "(", ")", ".", "isoformat", "(", ")", ",", "\"number\"", ":", "None", ",", "\"type\"", ":", "\"outbound\"", ",", "\"receiver\"", ":", "{", "\"name\"", ":", "\"\"", ",", "\"email\"", ":", "\"\"", ",", "\"street\"", ":", "\"\"", ",", "\"city\"", ":", "\"\"", ",", "\"postcode\"", ":", "\"\"", ",", "\"country\"", ":", "\"\"", "}", ",", "\"items\"", ":", "[", "]", ",", "}" ]
f57f44e7b0a1030786aafd6f387114abb546bb32
valid
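The `_init_empty` record above defaults `issue_date` to today and `due_date` to 14 days later, both as ISO strings. A minimal standalone sketch of that date arithmetic (the helper name is mine, not from the library):

```python
import datetime

def default_invoice_dates(net_days=14):
    """Return (issue_date, due_date) ISO strings, due `net_days` after today."""
    today = datetime.datetime.now().date()
    issue = today.isoformat()
    due = (today + datetime.timedelta(days=net_days)).isoformat()
    return issue, due
```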
Invoice.send
Marks the invoice as sent in Holvi If send_email is False then the invoice is *not* automatically emailed to the recipient and you must take care of sending the invoice yourself.
holviapi/invoicing.py
def send(self, send_email=True):
    """Marks the invoice as sent in Holvi

    If send_email is False then the invoice is *not* automatically emailed
    to the recipient and you must take care of sending the invoice yourself.
    """
    url = str(self.api.base_url + '{code}/status/').format(code=self.code)  # six.u messes this up
    payload = {
        'mark_as_sent': True,
        'send_email': send_email,
    }
    stat = self.api.connection.make_put(url, payload)
[ "Marks", "the", "invoice", "as", "sent", "in", "Holvi" ]
rambo/python-holviapi
python
https://github.com/rambo/python-holviapi/blob/f57f44e7b0a1030786aafd6f387114abb546bb32/holviapi/invoicing.py#L54-L65
[ "def", "send", "(", "self", ",", "send_email", "=", "True", ")", ":", "url", "=", "str", "(", "self", ".", "api", ".", "base_url", "+", "'{code}/status/'", ")", ".", "format", "(", "code", "=", "self", ".", "code", ")", "# six.u messes this up", "payload", "=", "{", "'mark_as_sent'", ":", "True", ",", "'send_email'", ":", "send_email", ",", "}", "stat", "=", "self", ".", "api", ".", "connection", ".", "make_put", "(", "url", ",", "payload", ")" ]
f57f44e7b0a1030786aafd6f387114abb546bb32
valid
Invoice.to_holvi_dict
Convert our Python object to JSON acceptable to Holvi API
holviapi/invoicing.py
def to_holvi_dict(self):
    """Convert our Python object to JSON acceptable to Holvi API"""
    self._jsondata["items"] = []
    for item in self.items:
        self._jsondata["items"].append(item.to_holvi_dict())
    self._jsondata["issue_date"] = self.issue_date.isoformat()
    self._jsondata["due_date"] = self.due_date.isoformat()
    self._jsondata["receiver"] = self.receiver.to_holvi_dict()
    return {k: v for (k, v) in self._jsondata.items() if k in self._valid_keys}
[ "Convert", "our", "Python", "object", "to", "JSON", "acceptable", "to", "Holvi", "API" ]
rambo/python-holviapi
python
https://github.com/rambo/python-holviapi/blob/f57f44e7b0a1030786aafd6f387114abb546bb32/holviapi/invoicing.py#L69-L77
[ "def", "to_holvi_dict", "(", "self", ")", ":", "self", ".", "_jsondata", "[", "\"items\"", "]", "=", "[", "]", "for", "item", "in", "self", ".", "items", ":", "self", ".", "_jsondata", "[", "\"items\"", "]", ".", "append", "(", "item", ".", "to_holvi_dict", "(", ")", ")", "self", ".", "_jsondata", "[", "\"issue_date\"", "]", "=", "self", ".", "issue_date", ".", "isoformat", "(", ")", "self", ".", "_jsondata", "[", "\"due_date\"", "]", "=", "self", ".", "due_date", ".", "isoformat", "(", ")", "self", ".", "_jsondata", "[", "\"receiver\"", "]", "=", "self", ".", "receiver", ".", "to_holvi_dict", "(", ")", "return", "{", "k", ":", "v", "for", "(", "k", ",", "v", ")", "in", "self", ".", "_jsondata", ".", "items", "(", ")", "if", "k", "in", "self", ".", "_valid_keys", "}" ]
f57f44e7b0a1030786aafd6f387114abb546bb32
valid
Invoice.save
Saves this invoice to Holvi, returns the created/updated invoice
holviapi/invoicing.py
def save(self):
    """Saves this invoice to Holvi, returns the created/updated invoice"""
    if not self.items:
        raise HolviError("No items")
    if not self.subject:
        raise HolviError("No subject")
    send_json = self.to_holvi_dict()
    if self.code:
        url = str(self.api.base_url + '{code}/').format(code=self.code)
        if not self.code:  # NOTE: unreachable inside the `if self.code` branch; the PUT below is always taken
            send_patch = {k: v for (k, v) in send_json.items() if k in self._patch_valid_keys}
            send_patch["items"] = []
            for item in self.items:
                send_patch["items"].append(item.to_holvi_dict(True))
            stat = self.api.connection.make_patch(url, send_patch)
        else:
            stat = self.api.connection.make_put(url, send_json)
        return Invoice(self.api, stat)
    else:
        url = str(self.api.base_url)
        stat = self.api.connection.make_post(url, send_json)
        return Invoice(self.api, stat)
[ "Saves", "this", "invoice", "to", "Holvi", "returns", "the", "created", "/", "updated", "invoice" ]
rambo/python-holviapi
python
https://github.com/rambo/python-holviapi/blob/f57f44e7b0a1030786aafd6f387114abb546bb32/holviapi/invoicing.py#L79-L100
[ "def", "save", "(", "self", ")", ":", "if", "not", "self", ".", "items", ":", "raise", "HolviError", "(", "\"No items\"", ")", "if", "not", "self", ".", "subject", ":", "raise", "HolviError", "(", "\"No subject\"", ")", "send_json", "=", "self", ".", "to_holvi_dict", "(", ")", "if", "self", ".", "code", ":", "url", "=", "str", "(", "self", ".", "api", ".", "base_url", "+", "'{code}/'", ")", ".", "format", "(", "code", "=", "self", ".", "code", ")", "if", "not", "self", ".", "code", ":", "send_patch", "=", "{", "k", ":", "v", "for", "(", "k", ",", "v", ")", "in", "send_json", ".", "items", "(", ")", "if", "k", "in", "self", ".", "_patch_valid_keys", "}", "send_patch", "[", "\"items\"", "]", "=", "[", "]", "for", "item", "in", "self", ".", "items", ":", "send_patch", "[", "\"items\"", "]", ".", "append", "(", "item", ".", "to_holvi_dict", "(", "True", ")", ")", "stat", "=", "self", ".", "api", ".", "connection", ".", "make_patch", "(", "url", ",", "send_patch", ")", "else", ":", "stat", "=", "self", ".", "api", ".", "connection", ".", "make_put", "(", "url", ",", "send_json", ")", "return", "Invoice", "(", "self", ".", "api", ",", "stat", ")", "else", ":", "url", "=", "str", "(", "self", ".", "api", ".", "base_url", ")", "stat", "=", "self", ".", "api", ".", "connection", ".", "make_post", "(", "url", ",", "send_json", ")", "return", "Invoice", "(", "self", ".", "api", ",", "stat", ")" ]
f57f44e7b0a1030786aafd6f387114abb546bb32
valid
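Both `to_holvi_dict` and `save` above whittle the full JSON state down to a whitelist of keys (`_valid_keys` / `_patch_valid_keys`) with a dict comprehension before sending. A standalone sketch of that filtering pattern (the function name is mine, not from the library):

```python
def filter_keys(data, valid_keys):
    """Keep only whitelisted keys, mirroring the `_valid_keys` filtering above."""
    return {k: v for (k, v) in data.items() if k in valid_keys}
```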
get_plugin_source
Returns the :class:`PluginSource` for the current module or the given module. The module can be provided by name (in which case an import will be attempted) or as a module object. If no plugin source can be discovered, the return value from this method is `None`. This function can be very useful if additional data has been attached to the plugin source. For instance this could allow plugins to get access to a back reference to the application that created them. :param module: optionally the module to locate the plugin source of. :param stacklevel: defines how many levels up the module should search for before it discovers the plugin frame. The default is 0. This can be useful for writing wrappers around this function.
dgitcore/vendor/pluginbase/pluginbase.py
def get_plugin_source(module=None, stacklevel=None):
    """Returns the :class:`PluginSource` for the current module or the given
    module.  The module can be provided by name (in which case an import
    will be attempted) or as a module object.

    If no plugin source can be discovered, the return value from this
    method is `None`.

    This function can be very useful if additional data has been attached
    to the plugin source.  For instance this could allow plugins to get
    access to a back reference to the application that created them.

    :param module: optionally the module to locate the plugin source of.
    :param stacklevel: defines how many levels up the module should search
                       for before it discovers the plugin frame.  The
                       default is 0.  This can be useful for writing
                       wrappers around this function.
    """
    if module is None:
        frm = sys._getframe((stacklevel or 0) + 1)
        name = frm.f_globals['__name__']
        glob = frm.f_globals
    elif isinstance(module, string_types):
        frm = sys._getframe(1)
        name = module
        glob = __import__(module, frm.f_globals, frm.f_locals, ['__dict__']).__dict__
    else:
        name = module.__name__
        glob = module.__dict__
    return _discover_space(name, glob)
[ "Returns", "the", ":", "class", ":", "PluginSource", "for", "the", "current", "module", "or", "the", "given", "module", ".", "The", "module", "can", "be", "provided", "by", "name", "(", "in", "which", "case", "an", "import", "will", "be", "attempted", ")", "or", "as", "a", "module", "object", "." ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/vendor/pluginbase/pluginbase.py#L42-L72
[ "def", "get_plugin_source", "(", "module", "=", "None", ",", "stacklevel", "=", "None", ")", ":", "if", "module", "is", "None", ":", "frm", "=", "sys", ".", "_getframe", "(", "(", "stacklevel", "or", "0", ")", "+", "1", ")", "name", "=", "frm", ".", "f_globals", "[", "'__name__'", "]", "glob", "=", "frm", ".", "f_globals", "elif", "isinstance", "(", "module", ",", "string_types", ")", ":", "frm", "=", "sys", ".", "_getframe", "(", "1", ")", "name", "=", "module", "glob", "=", "__import__", "(", "module", ",", "frm", ".", "f_globals", ",", "frm", ".", "f_locals", ",", "[", "'__dict__'", "]", ")", ".", "__dict__", "else", ":", "name", "=", "module", ".", "__name__", "glob", "=", "module", ".", "__dict__", "return", "_discover_space", "(", "name", ",", "glob", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
PluginSource.list_plugins
Returns a sorted list of all plugins that are available in this plugin source. This can be useful to automatically discover plugins that are available and is usually used together with :meth:`load_plugin`.
dgitcore/vendor/pluginbase/pluginbase.py
def list_plugins(self):
    """Returns a sorted list of all plugins that are available in this
    plugin source.  This can be useful to automatically discover plugins
    that are available and is usually used together with
    :meth:`load_plugin`.
    """
    rv = []
    for _, modname, ispkg in pkgutil.iter_modules(self.mod.__path__):
        rv.append(modname)
    return sorted(rv)
[ "Returns", "a", "sorted", "list", "of", "all", "plugins", "that", "are", "available", "in", "this", "plugin", "source", ".", "This", "can", "be", "useful", "to", "automatically", "discover", "plugins", "that", "are", "available", "and", "is", "usually", "used", "together", "with", ":", "meth", ":", "load_plugin", "." ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/vendor/pluginbase/pluginbase.py#L249-L258
[ "def", "list_plugins", "(", "self", ")", ":", "rv", "=", "[", "]", "for", "_", ",", "modname", ",", "ispkg", "in", "pkgutil", ".", "iter_modules", "(", "self", ".", "mod", ".", "__path__", ")", ":", "rv", ".", "append", "(", "modname", ")", "return", "sorted", "(", "rv", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
PluginSource.load_plugin
This automatically loads a plugin by the given name from the current source and returns the module. This is a convenient alternative to the import statement and saves you from invoking ``__import__`` or a similar function yourself. :param name: the name of the plugin to load.
dgitcore/vendor/pluginbase/pluginbase.py
def load_plugin(self, name):
    """This automatically loads a plugin by the given name from the current
    source and returns the module.  This is a convenient alternative to
    the import statement and saves you from invoking ``__import__`` or a
    similar function yourself.

    :param name: the name of the plugin to load.
    """
    if '.' in name:
        raise ImportError('Plugin names cannot contain dots.')
    with self:
        return __import__(self.base.package + '.' + name, globals(), {}, ['__name__'])
[ "This", "automatically", "loads", "a", "plugin", "by", "the", "given", "name", "from", "the", "current", "source", "and", "returns", "the", "module", ".", "This", "is", "a", "convenient", "alternative", "to", "the", "import", "statement", "and", "saves", "you", "from", "invoking", "__import__", "or", "a", "similar", "function", "yourself", "." ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/vendor/pluginbase/pluginbase.py#L260-L272
[ "def", "load_plugin", "(", "self", ",", "name", ")", ":", "if", "'.'", "in", "name", ":", "raise", "ImportError", "(", "'Plugin names cannot contain dots.'", ")", "with", "self", ":", "return", "__import__", "(", "self", ".", "base", ".", "package", "+", "'.'", "+", "name", ",", "globals", "(", ")", ",", "{", "}", ",", "[", "'__name__'", "]", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
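`load_plugin` above leans on a subtlety of `__import__`: with an empty fromlist it returns the *top-level* package, but with a non-empty fromlist (here `['__name__']`) it returns the submodule itself, which is what the caller wants back. A quick stdlib-only check of that behavior, no pluginbase required:

```python
import os.path

# Without a fromlist, __import__ returns the top-level package (`os`)...
top = __import__('os.path')
# ...with a non-empty fromlist it returns the submodule itself,
# which is why load_plugin passes ['__name__'].
sub = __import__('os.path', globals(), {}, ['__name__'])
assert top is os and sub is os.path
```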
PluginSource.open_resource
This function locates a resource inside the plugin and returns a byte stream to the contents of it. If the resource cannot be loaded an :exc:`IOError` will be raised. Only plugins that are real Python packages can contain resources. Plain old Python modules do not allow this for obvious reasons. .. versionadded:: 0.3 :param plugin: the name of the plugin to open the resource of. :param filename: the name of the file within the plugin to open.
dgitcore/vendor/pluginbase/pluginbase.py
def open_resource(self, plugin, filename):
    """This function locates a resource inside the plugin and returns a byte
    stream to the contents of it.  If the resource cannot be loaded an
    :exc:`IOError` will be raised.  Only plugins that are real Python
    packages can contain resources.  Plain old Python modules do not allow
    this for obvious reasons.

    .. versionadded:: 0.3

    :param plugin: the name of the plugin to open the resource of.
    :param filename: the name of the file within the plugin to open.
    """
    mod = self.load_plugin(plugin)
    fn = getattr(mod, '__file__', None)
    if fn is not None:
        if fn.endswith(('.pyc', '.pyo')):
            fn = fn[:-1]
        if os.path.isfile(fn):
            return open(os.path.join(os.path.dirname(fn), filename), 'rb')
    buf = pkgutil.get_data(self.mod.__name__ + '.' + plugin, filename)
    if buf is None:
        raise IOError(errno.ENOENT, 'Could not find resource')  # errno.ENOEXITS does not exist
    return NativeBytesIO(buf)
[ "This", "function", "locates", "a", "resource", "inside", "the", "plugin", "and", "returns", "a", "byte", "stream", "to", "the", "contents", "of", "it", ".", "If", "the", "resource", "cannot", "be", "loaded", "an", ":", "exc", ":", "IOError", "will", "be", "raised", ".", "Only", "plugins", "that", "are", "real", "Python", "packages", "can", "contain", "resources", ".", "Plain", "old", "Python", "modules", "do", "not", "allow", "this", "for", "obvious", "reasons", "." ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/vendor/pluginbase/pluginbase.py#L274-L296
[ "def", "open_resource", "(", "self", ",", "plugin", ",", "filename", ")", ":", "mod", "=", "self", ".", "load_plugin", "(", "plugin", ")", "fn", "=", "getattr", "(", "mod", ",", "'__file__'", ",", "None", ")", "if", "fn", "is", "not", "None", ":", "if", "fn", ".", "endswith", "(", "(", "'.pyc'", ",", "'.pyo'", ")", ")", ":", "fn", "=", "fn", "[", ":", "-", "1", "]", "if", "os", ".", "path", ".", "isfile", "(", "fn", ")", ":", "return", "open", "(", "os", ".", "path", ".", "join", "(", "os", ".", "path", ".", "dirname", "(", "fn", ")", ",", "filename", ")", ",", "'rb'", ")", "buf", "=", "pkgutil", ".", "get_data", "(", "self", ".", "mod", ".", "__name__", "+", "'.'", "+", "plugin", ",", "filename", ")", "if", "buf", "is", "None", ":", "raise", "IOError", "(", "errno", ".", "ENOEXITS", ",", "'Could not find resource'", ")", "return", "NativeBytesIO", "(", "buf", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
api_call_action
API wrapper documentation
dgitcore/api.py
def api_call_action(func):
    """ API wrapper documentation """
    def _inner(*args, **kwargs):
        return func(*args, **kwargs)
    _inner.__name__ = func.__name__
    _inner.__doc__ = func.__doc__
    return _inner
[ "API", "wrapper", "documentation" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/api.py#L12-L20
[ "def", "api_call_action", "(", "func", ")", ":", "def", "_inner", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "return", "func", "(", "*", "args", ",", "*", "*", "kwargs", ")", "_inner", ".", "__name__", "=", "func", ".", "__name__", "_inner", ".", "__doc__", "=", "func", ".", "__doc__", "return", "_inner" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
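`api_call_action` above copies `__name__` and `__doc__` onto the wrapper by hand; the stdlib's `functools.wraps` does the same and also carries `__module__`, `__qualname__`, and `__wrapped__`. An equivalent sketch (the decorated `list_datasets` function is a hypothetical example, not from dgit):

```python
import functools

def api_call_action(func):
    """API wrapper that preserves the wrapped function's metadata."""
    @functools.wraps(func)  # copies __name__, __doc__, __module__, __qualname__, ...
    def _inner(*args, **kwargs):
        return func(*args, **kwargs)
    return _inner

@api_call_action
def list_datasets():
    """List all datasets."""
    return []
```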
Framer.tx
Transmit a series of bytes :param message: a list of bytes to send :return: None
booty/framer.py
def tx(self, message):
    """
    Transmit a series of bytes

    :param message: a list of bytes to send
    :return: None
    """
    message = message if isinstance(message, list) else [message]
    length = len(message)
    length_high_byte = (length & 0xff00) >> 8
    length_low_byte = length & 0x00ff

    message_with_length = [length_low_byte, length_high_byte] + message

    sum1, sum2 = self._fletcher16_checksum(message_with_length)
    message_with_length.append(sum1)
    message_with_length.append(sum2)

    message = [self._START_OF_FRAME]
    for b in message_with_length:
        if b in [self._START_OF_FRAME, self._END_OF_FRAME, self._ESC]:
            message.append(self._ESC)
            message.append(b ^ self._ESC_XOR)
        else:
            message.append(b)
    message.append(self._END_OF_FRAME)

    self._port.write(message)
[ "Transmit", "a", "series", "of", "bytes", ":", "param", "message", ":", "a", "list", "of", "bytes", "to", "send", ":", "return", ":", "None" ]
slightlynybbled/booty
python
https://github.com/slightlynybbled/booty/blob/17f13f0bc28ad855a3fab895478c85c57f356a38/booty/framer.py#L30-L59
[ "def", "tx", "(", "self", ",", "message", ")", ":", "message", "=", "message", "if", "isinstance", "(", "message", ",", "list", ")", "else", "[", "message", "]", "length", "=", "len", "(", "message", ")", "length_high_byte", "=", "(", "length", "&", "0xff00", ")", ">>", "8", "length_low_byte", "=", "length", "&", "0x00ff", "message_with_length", "=", "[", "length_low_byte", ",", "length_high_byte", "]", "+", "message", "sum1", ",", "sum2", "=", "self", ".", "_fletcher16_checksum", "(", "message_with_length", ")", "message_with_length", ".", "append", "(", "sum1", ")", "message_with_length", ".", "append", "(", "sum2", ")", "message", "=", "[", "self", ".", "_START_OF_FRAME", "]", "for", "b", "in", "message_with_length", ":", "if", "b", "in", "[", "self", ".", "_START_OF_FRAME", ",", "self", ".", "_END_OF_FRAME", ",", "self", ".", "_ESC", "]", ":", "message", ".", "append", "(", "self", ".", "_ESC", ")", "message", ".", "append", "(", "b", "^", "self", ".", "_ESC_XOR", ")", "else", ":", "message", ".", "append", "(", "b", ")", "message", ".", "append", "(", "self", ".", "_END_OF_FRAME", ")", "self", ".", "_port", ".", "write", "(", "message", ")" ]
17f13f0bc28ad855a3fab895478c85c57f356a38
valid
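The `tx` record above builds a frame as SOF, then the escaped payload (length low byte, length high byte, message bytes, Fletcher sum1, sum2), then EOF, escaping any reserved byte by prefixing ESC and XOR-ing it. A self-contained sketch of that framing; the concrete byte values are assumptions for illustration, since the record only names them as `self._START_OF_FRAME` etc.:

```python
# Illustrative byte values -- the record only references these as attributes.
SOF, EOF, ESC, ESC_XOR = 0xF7, 0x7F, 0xF6, 0x20

def fletcher16(data):
    """Two 8-bit running sums, as in the record's _fletcher16_checksum."""
    s1 = s2 = 0
    for b in data:
        s1 = (s1 + b) & 0xFF
        s2 = (s2 + s1) & 0xFF
    return s1, s2

def frame(message):
    """SOF + escaped(len_lo, len_hi, message, sum1, sum2) + EOF."""
    length = len(message)
    body = [length & 0xFF, (length >> 8) & 0xFF] + list(message)
    s1, s2 = fletcher16(body)
    body += [s1, s2]
    out = [SOF]
    for b in body:
        if b in (SOF, EOF, ESC):      # reserved bytes must be escaped
            out += [ESC, b ^ ESC_XOR]
        else:
            out.append(b)
    out.append(EOF)
    return out
```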
Framer.rx
Receive a series of bytes that have been verified :return: a series of bytes as a tuple or None if empty
booty/framer.py
def rx(self):
    """
    Receive a series of bytes that have been verified

    :return: a series of bytes as a tuple or None if empty
    """
    if not self._threaded:
        self.run()

    try:
        return tuple(self._messages.pop(0))
    except IndexError:
        return None
[ "Receive", "a", "series", "of", "bytes", "that", "have", "been", "verified", ":", "return", ":", "a", "series", "of", "bytes", "as", "a", "tuple", "or", "None", "if", "empty" ]
slightlynybbled/booty
python
https://github.com/slightlynybbled/booty/blob/17f13f0bc28ad855a3fab895478c85c57f356a38/booty/framer.py#L61-L72
[ "def", "rx", "(", "self", ")", ":", "if", "not", "self", ".", "_threaded", ":", "self", ".", "run", "(", ")", "try", ":", "return", "tuple", "(", "self", ".", "_messages", ".", "pop", "(", "0", ")", ")", "except", "IndexError", ":", "return", "None" ]
17f13f0bc28ad855a3fab895478c85c57f356a38
valid
Framer._parse_raw_data
Parses the incoming data and determines if it is valid. Valid data gets placed into self._messages :return: None
booty/framer.py
def _parse_raw_data(self):
    """
    Parses the incoming data and determines if it is valid.  Valid data
    gets placed into self._messages

    :return: None
    """
    if self._START_OF_FRAME in self._raw and self._END_OF_FRAME in self._raw:
        while self._raw[0] != self._START_OF_FRAME and len(self._raw) > 0:
            self._raw.pop(0)

        if self._raw[0] == self._START_OF_FRAME:
            self._raw.pop(0)
            eof_index = self._raw.index(self._END_OF_FRAME)
            raw_message = self._raw[:eof_index]
            self._raw = self._raw[eof_index:]
            logger.debug('raw message: {}'.format(raw_message))
            message = self._remove_esc_chars(raw_message)
            logger.debug('message with checksum: {}'.format(message))

            expected_checksum = (message[-1] << 8) | message[-2]
            logger.debug('checksum: {}'.format(expected_checksum))

            message = message[:-2]  # checksum bytes
            logger.debug('message: {}'.format(message))

            sum1, sum2 = self._fletcher16_checksum(message)
            calculated_checksum = (sum2 << 8) | sum1

            if expected_checksum == calculated_checksum:
                message = message[2:]  # remove length
                logger.debug('valid message received: {}'.format(message))
                self._messages.append(message)
            else:
                logger.warning('invalid message received: {}, discarding'.format(message))
                logger.debug('expected checksum: {}, calculated checksum: {}'.format(
                    expected_checksum, calculated_checksum))

    # remove any extra bytes at the beginning
    try:
        while self._raw[0] != self._START_OF_FRAME and len(self._raw) > 0:
            self._raw.pop(0)
    except IndexError:
        pass
[ "Parses", "the", "incoming", "data", "and", "determines", "if", "it", "is", "valid", ".", "Valid", "data", "gets", "placed", "into", "self", ".", "_messages", ":", "return", ":", "None" ]
slightlynybbled/booty
python
https://github.com/slightlynybbled/booty/blob/17f13f0bc28ad855a3fab895478c85c57f356a38/booty/framer.py#L83-L128
[ "def", "_parse_raw_data", "(", "self", ")", ":", "if", "self", ".", "_START_OF_FRAME", "in", "self", ".", "_raw", "and", "self", ".", "_END_OF_FRAME", "in", "self", ".", "_raw", ":", "while", "self", ".", "_raw", "[", "0", "]", "!=", "self", ".", "_START_OF_FRAME", "and", "len", "(", "self", ".", "_raw", ")", ">", "0", ":", "self", ".", "_raw", ".", "pop", "(", "0", ")", "if", "self", ".", "_raw", "[", "0", "]", "==", "self", ".", "_START_OF_FRAME", ":", "self", ".", "_raw", ".", "pop", "(", "0", ")", "eof_index", "=", "self", ".", "_raw", ".", "index", "(", "self", ".", "_END_OF_FRAME", ")", "raw_message", "=", "self", ".", "_raw", "[", ":", "eof_index", "]", "self", ".", "_raw", "=", "self", ".", "_raw", "[", "eof_index", ":", "]", "logger", ".", "debug", "(", "'raw message: {}'", ".", "format", "(", "raw_message", ")", ")", "message", "=", "self", ".", "_remove_esc_chars", "(", "raw_message", ")", "logger", ".", "debug", "(", "'message with checksum: {}'", ".", "format", "(", "message", ")", ")", "expected_checksum", "=", "(", "message", "[", "-", "1", "]", "<<", "8", ")", "|", "message", "[", "-", "2", "]", "logger", ".", "debug", "(", "'checksum: {}'", ".", "format", "(", "expected_checksum", ")", ")", "message", "=", "message", "[", ":", "-", "2", "]", "# checksum bytes", "logger", ".", "debug", "(", "'message: {}'", ".", "format", "(", "message", ")", ")", "sum1", ",", "sum2", "=", "self", ".", "_fletcher16_checksum", "(", "message", ")", "calculated_checksum", "=", "(", "sum2", "<<", "8", ")", "|", "sum1", "if", "expected_checksum", "==", "calculated_checksum", ":", "message", "=", "message", "[", "2", ":", "]", "# remove length", "logger", ".", "debug", "(", "'valid message received: {}'", ".", "format", "(", "message", ")", ")", "self", ".", "_messages", ".", "append", "(", "message", ")", "else", ":", "logger", ".", "warning", "(", "'invalid message received: {}, discarding'", ".", "format", "(", "message", ")", ")", "logger", ".", "debug", "(", 
"'expected checksum: {}, calculated checksum: {}'", ".", "format", "(", "expected_checksum", ",", "calculated_checksum", ")", ")", "# remove any extra bytes at the beginning", "try", ":", "while", "self", ".", "_raw", "[", "0", "]", "!=", "self", ".", "_START_OF_FRAME", "and", "len", "(", "self", ".", "_raw", ")", ">", "0", ":", "self", ".", "_raw", ".", "pop", "(", "0", ")", "except", "IndexError", ":", "pass" ]
17f13f0bc28ad855a3fab895478c85c57f356a38
valid
Framer._fletcher16_checksum
Calculates a fletcher16 checksum for the list of bytes :param data: a list of bytes that comprise the message :return:
booty/framer.py
def _fletcher16_checksum(self, data): """ Calculates a fletcher16 checksum for the list of bytes :param data: a list of bytes that comprise the message :return: """ sum1 = 0 sum2 = 0 for i, b in enumerate(data): sum1 += b sum1 &= 0xff # Results wrapped at 16 bits sum2 += sum1 sum2 &= 0xff logger.debug('sum1: {} sum2: {}'.format(sum1, sum2)) return sum1, sum2
def _fletcher16_checksum(self, data): """ Calculates a fletcher16 checksum for the list of bytes :param data: a list of bytes that comprise the message :return: """ sum1 = 0 sum2 = 0 for i, b in enumerate(data): sum1 += b sum1 &= 0xff # Results wrapped at 16 bits sum2 += sum1 sum2 &= 0xff logger.debug('sum1: {} sum2: {}'.format(sum1, sum2)) return sum1, sum2
[ "Calculates", "a", "fletcher16", "checksum", "for", "the", "list", "of", "bytes", ":", "param", "data", ":", "a", "list", "of", "bytes", "that", "comprise", "the", "message", ":", "return", ":" ]
slightlynybbled/booty
python
https://github.com/slightlynybbled/booty/blob/17f13f0bc28ad855a3fab895478c85c57f356a38/booty/framer.py#L130-L147
[ "def", "_fletcher16_checksum", "(", "self", ",", "data", ")", ":", "sum1", "=", "0", "sum2", "=", "0", "for", "i", ",", "b", "in", "enumerate", "(", "data", ")", ":", "sum1", "+=", "b", "sum1", "&=", "0xff", "# Results wrapped at 16 bits", "sum2", "+=", "sum1", "sum2", "&=", "0xff", "logger", ".", "debug", "(", "'sum1: {} sum2: {}'", ".", "format", "(", "sum1", ",", "sum2", ")", ")", "return", "sum1", ",", "sum2" ]
17f13f0bc28ad855a3fab895478c85c57f356a38
valid
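The checksum record above uses a Fletcher-16 variant whose running sums wrap at 8 bits (`& 0xff`) rather than the textbook modulo 255. A minimal standalone sketch of that variant (illustrative only, not the booty implementation itself):

```python
def fletcher16(data):
    # sum1 accumulates the bytes, sum2 accumulates the running sum1;
    # both wrap at 8 bits (& 0xff), matching the variant in the record
    # above rather than the textbook modulo-255 Fletcher-16.
    sum1 = 0
    sum2 = 0
    for b in data:
        sum1 = (sum1 + b) & 0xff
        sum2 = (sum2 + sum1) & 0xff
    return sum1, sum2

def assemble(sum1, sum2):
    # _parse_raw_data compares (sum2 << 8) | sum1 against the two
    # trailing checksum bytes of a frame.
    return (sum2 << 8) | sum1
```

Because sum2 weights earlier bytes more heavily, this catches byte reorderings that a plain additive checksum would miss.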
Framer._remove_esc_chars
Removes any escape characters from the message :param raw_message: a list of bytes containing the un-processed data :return: a message that has the escaped characters appropriately un-escaped
booty/framer.py
def _remove_esc_chars(self, raw_message): """ Removes any escape characters from the message :param raw_message: a list of bytes containing the un-processed data :return: a message that has the escaped characters appropriately un-escaped """ message = [] escape_next = False for c in raw_message: if escape_next: message.append(c ^ self._ESC_XOR) escape_next = False else: if c == self._ESC: escape_next = True else: message.append(c) return message
def _remove_esc_chars(self, raw_message): """ Removes any escape characters from the message :param raw_message: a list of bytes containing the un-processed data :return: a message that has the escaped characters appropriately un-escaped """ message = [] escape_next = False for c in raw_message: if escape_next: message.append(c ^ self._ESC_XOR) escape_next = False else: if c == self._ESC: escape_next = True else: message.append(c) return message
[ "Removes", "any", "escape", "characters", "from", "the", "message", ":", "param", "raw_message", ":", "a", "list", "of", "bytes", "containing", "the", "un", "-", "processed", "data", ":", "return", ":", "a", "message", "that", "has", "the", "escaped", "characters", "appropriately", "un", "-", "escaped" ]
slightlynybbled/booty
python
https://github.com/slightlynybbled/booty/blob/17f13f0bc28ad855a3fab895478c85c57f356a38/booty/framer.py#L149-L167
[ "def", "_remove_esc_chars", "(", "self", ",", "raw_message", ")", ":", "message", "=", "[", "]", "escape_next", "=", "False", "for", "c", "in", "raw_message", ":", "if", "escape_next", ":", "message", ".", "append", "(", "c", "^", "self", ".", "_ESC_XOR", ")", "escape_next", "=", "False", "else", ":", "if", "c", "==", "self", ".", "_ESC", ":", "escape_next", "=", "True", "else", ":", "message", ".", "append", "(", "c", ")", "return", "message" ]
17f13f0bc28ad855a3fab895478c85c57f356a38
valid
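The unescaping record above is the receive side of classic byte stuffing: the sender prefixes any reserved byte with an escape marker and XORs it, so frame delimiters can never appear in the payload. A standalone sketch with hypothetical constants (the actual `_ESC`/`_ESC_XOR` values live on booty's `Framer` class):

```python
# Hypothetical constants for illustration; booty's Framer defines
# its own _ESC and _ESC_XOR values on the class.
ESC = 0x1B
ESC_XOR = 0x20

def unescape(raw):
    # A byte that follows ESC is restored by XOR-ing it with ESC_XOR;
    # everything else passes through unchanged.
    out = []
    escape_next = False
    for b in raw:
        if escape_next:
            out.append(b ^ ESC_XOR)
            escape_next = False
        elif b == ESC:
            escape_next = True
        else:
            out.append(b)
    return out
```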
Framer.run
Receives the serial data into the self._raw buffer :return:
booty/framer.py
def run(self): """ Receives the serial data into the self._raw buffer :return: """ run_once = True while run_once or self._threaded: waiting = self._port.in_waiting if waiting > 0: temp = [int(c) for c in self._port.read(waiting)] self._raw += temp self._parse_raw_data() run_once = False if self._threaded: time.sleep(self._timeout)
def run(self): """ Receives the serial data into the self._raw buffer :return: """ run_once = True while run_once or self._threaded: waiting = self._port.in_waiting if waiting > 0: temp = [int(c) for c in self._port.read(waiting)] self._raw += temp self._parse_raw_data() run_once = False if self._threaded: time.sleep(self._timeout)
[ "Receives", "the", "serial", "data", "into", "the", "self", ".", "_raw", "buffer", ":", "return", ":" ]
slightlynybbled/booty
python
https://github.com/slightlynybbled/booty/blob/17f13f0bc28ad855a3fab895478c85c57f356a38/booty/framer.py#L169-L185
[ "def", "run", "(", "self", ")", ":", "run_once", "=", "True", "while", "run_once", "or", "self", ".", "_threaded", ":", "waiting", "=", "self", ".", "_port", ".", "in_waiting", "if", "waiting", ">", "0", ":", "temp", "=", "[", "int", "(", "c", ")", "for", "c", "in", "self", ".", "_port", ".", "read", "(", "waiting", ")", "]", "self", ".", "_raw", "+=", "temp", "self", ".", "_parse_raw_data", "(", ")", "run_once", "=", "False", "if", "self", ".", "_threaded", ":", "time", ".", "sleep", "(", "self", ".", "_timeout", ")" ]
17f13f0bc28ad855a3fab895478c85c57f356a38
valid
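The receive loop above touches only two pyserial attributes, `in_waiting` and `read`, so a single non-threaded pass can be exercised against a stub port. A sketch (the real `Framer.run` also calls `_parse_raw_data` and sleeps between threaded passes):

```python
class FakePort:
    # Stub exposing the two pyserial attributes the loop uses.
    def __init__(self, data):
        self._data = data

    @property
    def in_waiting(self):
        return len(self._data)

    def read(self, n):
        chunk, self._data = self._data[:n], self._data[n:]
        return chunk

def drain(port):
    # One non-threaded pass of the receive loop: pull whatever bytes
    # are waiting into a list of ints, as Framer.run does.
    raw = []
    waiting = port.in_waiting
    if waiting > 0:
        raw += [int(c) for c in port.read(waiting)]
    return raw
```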
Order.save
Saves this order to Holvi, returns a tuple with the order itself and checkout_uri
holviapi/checkout.py
def save(self): """Saves this order to Holvi, returns a tuple with the order itself and checkout_uri""" if self.code: raise HolviError("Orders cannot be updated") send_json = self.to_holvi_dict() send_json.update({ 'pool': self.api.connection.pool }) url = six.u(self.api.base_url + "order/") stat = self.api.connection.make_post(url, send_json) code = stat["details_uri"].split("/")[-2] # Maybe slightly ugly but I don't want to basically reimplement all but uri formation of the api method return (stat["checkout_uri"], self.api.get_order(code))
def save(self): """Saves this order to Holvi, returns a tuple with the order itself and checkout_uri""" if self.code: raise HolviError("Orders cannot be updated") send_json = self.to_holvi_dict() send_json.update({ 'pool': self.api.connection.pool }) url = six.u(self.api.base_url + "order/") stat = self.api.connection.make_post(url, send_json) code = stat["details_uri"].split("/")[-2] # Maybe slightly ugly but I don't want to basically reimplement all but uri formation of the api method return (stat["checkout_uri"], self.api.get_order(code))
[ "Saves", "this", "order", "to", "Holvi", "returns", "a", "tuple", "with", "the", "order", "itself", "and", "checkout_uri" ]
rambo/python-holviapi
python
https://github.com/rambo/python-holviapi/blob/f57f44e7b0a1030786aafd6f387114abb546bb32/holviapi/checkout.py#L78-L89
[ "def", "save", "(", "self", ")", ":", "if", "self", ".", "code", ":", "raise", "HolviError", "(", "\"Orders cannot be updated\"", ")", "send_json", "=", "self", ".", "to_holvi_dict", "(", ")", "send_json", ".", "update", "(", "{", "'pool'", ":", "self", ".", "api", ".", "connection", ".", "pool", "}", ")", "url", "=", "six", ".", "u", "(", "self", ".", "api", ".", "base_url", "+", "\"order/\"", ")", "stat", "=", "self", ".", "api", ".", "connection", ".", "make_post", "(", "url", ",", "send_json", ")", "code", "=", "stat", "[", "\"details_uri\"", "]", ".", "split", "(", "\"/\"", ")", "[", "-", "2", "]", "# Maybe slightly ugly but I don't want to basically reimplement all but uri formation of the api method", "return", "(", "stat", "[", "\"checkout_uri\"", "]", ",", "self", ".", "api", ".", "get_order", "(", "code", ")", ")" ]
f57f44e7b0a1030786aafd6f387114abb546bb32
valid
untokenize
Return source code based on tokens. This is like tokenize.untokenize(), but it preserves spacing between tokens. So if the original soure code had multiple spaces between some tokens or if escaped newlines were used, those things will be reflected by untokenize().
untokenize.py
def untokenize(tokens): """Return source code based on tokens. This is like tokenize.untokenize(), but it preserves spacing between tokens. So if the original soure code had multiple spaces between some tokens or if escaped newlines were used, those things will be reflected by untokenize(). """ text = '' previous_line = '' last_row = 0 last_column = -1 last_non_whitespace_token_type = None for (token_type, token_string, start, end, line) in tokens: if TOKENIZE_HAS_ENCODING and token_type == tokenize.ENCODING: continue (start_row, start_column) = start (end_row, end_column) = end # Preserve escaped newlines. if ( last_non_whitespace_token_type != tokenize.COMMENT and start_row > last_row and previous_line.endswith(('\\\n', '\\\r\n', '\\\r')) ): text += previous_line[len(previous_line.rstrip(' \t\n\r\\')):] # Preserve spacing. if start_row > last_row: last_column = 0 if start_column > last_column: text += line[last_column:start_column] text += token_string previous_line = line last_row = end_row last_column = end_column if token_type not in WHITESPACE_TOKENS: last_non_whitespace_token_type = token_type return text
def untokenize(tokens): """Return source code based on tokens. This is like tokenize.untokenize(), but it preserves spacing between tokens. So if the original soure code had multiple spaces between some tokens or if escaped newlines were used, those things will be reflected by untokenize(). """ text = '' previous_line = '' last_row = 0 last_column = -1 last_non_whitespace_token_type = None for (token_type, token_string, start, end, line) in tokens: if TOKENIZE_HAS_ENCODING and token_type == tokenize.ENCODING: continue (start_row, start_column) = start (end_row, end_column) = end # Preserve escaped newlines. if ( last_non_whitespace_token_type != tokenize.COMMENT and start_row > last_row and previous_line.endswith(('\\\n', '\\\r\n', '\\\r')) ): text += previous_line[len(previous_line.rstrip(' \t\n\r\\')):] # Preserve spacing. if start_row > last_row: last_column = 0 if start_column > last_column: text += line[last_column:start_column] text += token_string previous_line = line last_row = end_row last_column = end_column if token_type not in WHITESPACE_TOKENS: last_non_whitespace_token_type = token_type return text
[ "Return", "source", "code", "based", "on", "tokens", "." ]
myint/untokenize
python
https://github.com/myint/untokenize/blob/137ae8b8ec03e94444325172451ba2104c8ee05e/untokenize.py#L36-L82
[ "def", "untokenize", "(", "tokens", ")", ":", "text", "=", "''", "previous_line", "=", "''", "last_row", "=", "0", "last_column", "=", "-", "1", "last_non_whitespace_token_type", "=", "None", "for", "(", "token_type", ",", "token_string", ",", "start", ",", "end", ",", "line", ")", "in", "tokens", ":", "if", "TOKENIZE_HAS_ENCODING", "and", "token_type", "==", "tokenize", ".", "ENCODING", ":", "continue", "(", "start_row", ",", "start_column", ")", "=", "start", "(", "end_row", ",", "end_column", ")", "=", "end", "# Preserve escaped newlines.", "if", "(", "last_non_whitespace_token_type", "!=", "tokenize", ".", "COMMENT", "and", "start_row", ">", "last_row", "and", "previous_line", ".", "endswith", "(", "(", "'\\\\\\n'", ",", "'\\\\\\r\\n'", ",", "'\\\\\\r'", ")", ")", ")", ":", "text", "+=", "previous_line", "[", "len", "(", "previous_line", ".", "rstrip", "(", "' \\t\\n\\r\\\\'", ")", ")", ":", "]", "# Preserve spacing.", "if", "start_row", ">", "last_row", ":", "last_column", "=", "0", "if", "start_column", ">", "last_column", ":", "text", "+=", "line", "[", "last_column", ":", "start_column", "]", "text", "+=", "token_string", "previous_line", "=", "line", "last_row", "=", "end_row", "last_column", "=", "end_column", "if", "token_type", "not", "in", "WHITESPACE_TOKENS", ":", "last_non_whitespace_token_type", "=", "token_type", "return", "text" ]
137ae8b8ec03e94444325172451ba2104c8ee05e
valid
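The spacing-preservation idea in the `untokenize` record can be demonstrated with a trimmed-down version that re-emits the inter-token source text sliced out of each token's `line` field (this sketch omits the escaped-newline handling of the full implementation above):

```python
import io
import tokenize

def exact_untokenize(tokens):
    # Emit each token plus whatever source text sat between it and
    # the previous token, taken from the token's `line` field.
    text = ''
    last_row, last_col = 1, 0
    for tok_type, tok_str, (srow, scol), (erow, ecol), line in tokens:
        if tok_type == tokenize.ENCODING:
            continue
        if srow > last_row:
            last_col = 0          # new line: spacing restarts at column 0
        if scol > last_col:
            text += line[last_col:scol]  # preserve original whitespace
        text += tok_str
        last_row, last_col = erow, ecol
    return text
```

Round-tripping a snippet with irregular spacing through `tokenize.generate_tokens` and this function reproduces the source exactly, which `tokenize.untokenize` in compatibility mode does not guarantee.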
init
Load profile INI
dgitcore/config.py
def init(globalvars=None, show=False): """ Load profile INI """ global config profileini = getprofileini() if os.path.exists(profileini): config = configparser.ConfigParser() config.read(profileini) mgr = plugins_get_mgr() mgr.update_configs(config) if show: for source in config: print("[%s] :" %(source)) for k in config[source]: print(" %s : %s" % (k, config[source][k])) else: print("Profile does not exist. So creating one") if not show: update(globalvars) print("Complete init")
def init(globalvars=None, show=False): """ Load profile INI """ global config profileini = getprofileini() if os.path.exists(profileini): config = configparser.ConfigParser() config.read(profileini) mgr = plugins_get_mgr() mgr.update_configs(config) if show: for source in config: print("[%s] :" %(source)) for k in config[source]: print(" %s : %s" % (k, config[source][k])) else: print("Profile does not exist. So creating one") if not show: update(globalvars) print("Complete init")
[ "Load", "profile", "INI" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/config.py#L79-L103
[ "def", "init", "(", "globalvars", "=", "None", ",", "show", "=", "False", ")", ":", "global", "config", "profileini", "=", "getprofileini", "(", ")", "if", "os", ".", "path", ".", "exists", "(", "profileini", ")", ":", "config", "=", "configparser", ".", "ConfigParser", "(", ")", "config", ".", "read", "(", "profileini", ")", "mgr", "=", "plugins_get_mgr", "(", ")", "mgr", ".", "update_configs", "(", "config", ")", "if", "show", ":", "for", "source", "in", "config", ":", "print", "(", "\"[%s] :\"", "%", "(", "source", ")", ")", "for", "k", "in", "config", "[", "source", "]", ":", "print", "(", "\" %s : %s\"", "%", "(", "k", ",", "config", "[", "source", "]", "[", "k", "]", ")", ")", "else", ":", "print", "(", "\"Profile does not exist. So creating one\"", ")", "if", "not", "show", ":", "update", "(", "globalvars", ")", "print", "(", "\"Complete init\"", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
update
Update the profile
dgitcore/config.py
def update(globalvars): """ Update the profile """ global config profileini = getprofileini() config = configparser.ConfigParser() config.read(profileini) defaults = {} if globalvars is not None: defaults = {a[0]: a[1] for a in globalvars } # Generic variables to be captured... generic_configs = [{ 'name': 'User', 'nature': 'generic', 'description': "General information", 'variables': ['user.email', 'user.name', 'user.fullname'], 'defaults': { 'user.email': { 'value': defaults.get('user.email',''), 'description': "Email address", 'validator': EmailValidator() }, 'user.fullname': { 'value': defaults.get('user.fullname',''), 'description': "Full Name", 'validator': NonEmptyValidator() }, 'user.name': { 'value': defaults.get('user.name', getpass.getuser()), 'description': "Name", 'validator': NonEmptyValidator() }, } }] # Gather configuration requirements from all plugins mgr = plugins_get_mgr() extra_configs = mgr.gather_configs() allconfigs = generic_configs + extra_configs # Read the existing config and update the defaults for c in allconfigs: name = c['name'] for v in c['variables']: try: c['defaults'][v]['value'] = config[name][v] except: continue for c in allconfigs: print("") print(c['description']) print("==================") if len(c['variables']) == 0: print("Nothing to do. Enabled by default") continue name = c['name'] config[name] = {} config[name]['nature'] = c['nature'] for v in c['variables']: # defaults value = '' description = v + " " helptext = "" validator = None # Look up pre-set values if v in c['defaults']: value = c['defaults'][v].get('value','') helptext = c['defaults'][v].get("description","") validator = c['defaults'][v].get('validator',None) if helptext != "": description += "(" + helptext + ")" # Get user input.. while True: choice = input_with_default(description, value) if validator is not None: if validator.is_valid(choice): break else: print("Invalid input. 
Expected input is {}".format(validator.message)) else: break config[name][v] = choice if v == 'enable' and choice == 'n': break with open(profileini, 'w') as fd: config.write(fd) print("Updated profile file:", config)
def update(globalvars): """ Update the profile """ global config profileini = getprofileini() config = configparser.ConfigParser() config.read(profileini) defaults = {} if globalvars is not None: defaults = {a[0]: a[1] for a in globalvars } # Generic variables to be captured... generic_configs = [{ 'name': 'User', 'nature': 'generic', 'description': "General information", 'variables': ['user.email', 'user.name', 'user.fullname'], 'defaults': { 'user.email': { 'value': defaults.get('user.email',''), 'description': "Email address", 'validator': EmailValidator() }, 'user.fullname': { 'value': defaults.get('user.fullname',''), 'description': "Full Name", 'validator': NonEmptyValidator() }, 'user.name': { 'value': defaults.get('user.name', getpass.getuser()), 'description': "Name", 'validator': NonEmptyValidator() }, } }] # Gather configuration requirements from all plugins mgr = plugins_get_mgr() extra_configs = mgr.gather_configs() allconfigs = generic_configs + extra_configs # Read the existing config and update the defaults for c in allconfigs: name = c['name'] for v in c['variables']: try: c['defaults'][v]['value'] = config[name][v] except: continue for c in allconfigs: print("") print(c['description']) print("==================") if len(c['variables']) == 0: print("Nothing to do. Enabled by default") continue name = c['name'] config[name] = {} config[name]['nature'] = c['nature'] for v in c['variables']: # defaults value = '' description = v + " " helptext = "" validator = None # Look up pre-set values if v in c['defaults']: value = c['defaults'][v].get('value','') helptext = c['defaults'][v].get("description","") validator = c['defaults'][v].get('validator',None) if helptext != "": description += "(" + helptext + ")" # Get user input.. while True: choice = input_with_default(description, value) if validator is not None: if validator.is_valid(choice): break else: print("Invalid input. 
Expected input is {}".format(validator.message)) else: break config[name][v] = choice if v == 'enable' and choice == 'n': break with open(profileini, 'w') as fd: config.write(fd) print("Updated profile file:", config)
[ "Update", "the", "profile" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/config.py#L110-L211
[ "def", "update", "(", "globalvars", ")", ":", "global", "config", "profileini", "=", "getprofileini", "(", ")", "config", "=", "configparser", ".", "ConfigParser", "(", ")", "config", ".", "read", "(", "profileini", ")", "defaults", "=", "{", "}", "if", "globalvars", "is", "not", "None", ":", "defaults", "=", "{", "a", "[", "0", "]", ":", "a", "[", "1", "]", "for", "a", "in", "globalvars", "}", "# Generic variables to be captured...", "generic_configs", "=", "[", "{", "'name'", ":", "'User'", ",", "'nature'", ":", "'generic'", ",", "'description'", ":", "\"General information\"", ",", "'variables'", ":", "[", "'user.email'", ",", "'user.name'", ",", "'user.fullname'", "]", ",", "'defaults'", ":", "{", "'user.email'", ":", "{", "'value'", ":", "defaults", ".", "get", "(", "'user.email'", ",", "''", ")", ",", "'description'", ":", "\"Email address\"", ",", "'validator'", ":", "EmailValidator", "(", ")", "}", ",", "'user.fullname'", ":", "{", "'value'", ":", "defaults", ".", "get", "(", "'user.fullname'", ",", "''", ")", ",", "'description'", ":", "\"Full Name\"", ",", "'validator'", ":", "NonEmptyValidator", "(", ")", "}", ",", "'user.name'", ":", "{", "'value'", ":", "defaults", ".", "get", "(", "'user.name'", ",", "getpass", ".", "getuser", "(", ")", ")", ",", "'description'", ":", "\"Name\"", ",", "'validator'", ":", "NonEmptyValidator", "(", ")", "}", ",", "}", "}", "]", "# Gather configuration requirements from all plugins", "mgr", "=", "plugins_get_mgr", "(", ")", "extra_configs", "=", "mgr", ".", "gather_configs", "(", ")", "allconfigs", "=", "generic_configs", "+", "extra_configs", "# Read the existing config and update the defaults", "for", "c", "in", "allconfigs", ":", "name", "=", "c", "[", "'name'", "]", "for", "v", "in", "c", "[", "'variables'", "]", ":", "try", ":", "c", "[", "'defaults'", "]", "[", "v", "]", "[", "'value'", "]", "=", "config", "[", "name", "]", "[", "v", "]", "except", ":", "continue", "for", "c", "in", "allconfigs", ":", "print", 
"(", "\"\"", ")", "print", "(", "c", "[", "'description'", "]", ")", "print", "(", "\"==================\"", ")", "if", "len", "(", "c", "[", "'variables'", "]", ")", "==", "0", ":", "print", "(", "\"Nothing to do. Enabled by default\"", ")", "continue", "name", "=", "c", "[", "'name'", "]", "config", "[", "name", "]", "=", "{", "}", "config", "[", "name", "]", "[", "'nature'", "]", "=", "c", "[", "'nature'", "]", "for", "v", "in", "c", "[", "'variables'", "]", ":", "# defaults", "value", "=", "''", "description", "=", "v", "+", "\" \"", "helptext", "=", "\"\"", "validator", "=", "None", "# Look up pre-set values", "if", "v", "in", "c", "[", "'defaults'", "]", ":", "value", "=", "c", "[", "'defaults'", "]", "[", "v", "]", ".", "get", "(", "'value'", ",", "''", ")", "helptext", "=", "c", "[", "'defaults'", "]", "[", "v", "]", ".", "get", "(", "\"description\"", ",", "\"\"", ")", "validator", "=", "c", "[", "'defaults'", "]", "[", "v", "]", ".", "get", "(", "'validator'", ",", "None", ")", "if", "helptext", "!=", "\"\"", ":", "description", "+=", "\"(\"", "+", "helptext", "+", "\")\"", "# Get user input..", "while", "True", ":", "choice", "=", "input_with_default", "(", "description", ",", "value", ")", "if", "validator", "is", "not", "None", ":", "if", "validator", ".", "is_valid", "(", "choice", ")", ":", "break", "else", ":", "print", "(", "\"Invalid input. Expected input is {}\"", ".", "format", "(", "validator", ".", "message", ")", ")", "else", ":", "break", "config", "[", "name", "]", "[", "v", "]", "=", "choice", "if", "v", "==", "'enable'", "and", "choice", "==", "'n'", ":", "break", "with", "open", "(", "profileini", ",", "'w'", ")", "as", "fd", ":", "config", ".", "write", "(", "fd", ")", "print", "(", "\"Updated profile file:\"", ",", "config", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
S3Backend.init_repo
Insert hook into the repo
dgitcore/contrib/backends/s3.py
def init_repo(self, gitdir): """ Insert hook into the repo """ hooksdir = os.path.join(gitdir, 'hooks') content = postreceive_template % { 'client': self.client, 'bucket': self.bucket, 's3cfg': self.s3cfg, 'prefix': self.prefix } postrecv_filename =os.path.join(hooksdir, 'post-receive') with open(postrecv_filename,'w') as fd: fd.write(content) self.make_hook_executable(postrecv_filename) print("Wrote to", postrecv_filename)
def init_repo(self, gitdir): """ Insert hook into the repo """ hooksdir = os.path.join(gitdir, 'hooks') content = postreceive_template % { 'client': self.client, 'bucket': self.bucket, 's3cfg': self.s3cfg, 'prefix': self.prefix } postrecv_filename =os.path.join(hooksdir, 'post-receive') with open(postrecv_filename,'w') as fd: fd.write(content) self.make_hook_executable(postrecv_filename) print("Wrote to", postrecv_filename)
[ "Insert", "hook", "into", "the", "repo" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/contrib/backends/s3.py#L133-L151
[ "def", "init_repo", "(", "self", ",", "gitdir", ")", ":", "hooksdir", "=", "os", ".", "path", ".", "join", "(", "gitdir", ",", "'hooks'", ")", "content", "=", "postreceive_template", "%", "{", "'client'", ":", "self", ".", "client", ",", "'bucket'", ":", "self", ".", "bucket", ",", "'s3cfg'", ":", "self", ".", "s3cfg", ",", "'prefix'", ":", "self", ".", "prefix", "}", "postrecv_filename", "=", "os", ".", "path", ".", "join", "(", "hooksdir", ",", "'post-receive'", ")", "with", "open", "(", "postrecv_filename", ",", "'w'", ")", "as", "fd", ":", "fd", ".", "write", "(", "content", ")", "self", ".", "make_hook_executable", "(", "postrecv_filename", ")", "print", "(", "\"Wrote to\"", ",", "postrecv_filename", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
compute_sha256
Try the library. If it doesnt work, use the command line..
dgitcore/helper.py
def compute_sha256(filename): """ Try the library. If it doesnt work, use the command line.. """ try: h = sha256() fd = open(filename, 'rb') while True: buf = fd.read(0x1000000) if buf in [None, ""]: break h.update(buf.encode('utf-8')) fd.close() return h.hexdigest() except: output = run(["sha256sum", "-b", filename]) return output.split(" ")[0]
def compute_sha256(filename): """ Try the library. If it doesnt work, use the command line.. """ try: h = sha256() fd = open(filename, 'rb') while True: buf = fd.read(0x1000000) if buf in [None, ""]: break h.update(buf.encode('utf-8')) fd.close() return h.hexdigest() except: output = run(["sha256sum", "-b", filename]) return output.split(" ")[0]
[ "Try", "the", "library", ".", "If", "it", "doesnt", "work", "use", "the", "command", "line", ".." ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/helper.py#L112-L128
[ "def", "compute_sha256", "(", "filename", ")", ":", "try", ":", "h", "=", "sha256", "(", ")", "fd", "=", "open", "(", "filename", ",", "'rb'", ")", "while", "True", ":", "buf", "=", "fd", ".", "read", "(", "0x1000000", ")", "if", "buf", "in", "[", "None", ",", "\"\"", "]", ":", "break", "h", ".", "update", "(", "buf", ".", "encode", "(", "'utf-8'", ")", ")", "fd", ".", "close", "(", ")", "return", "h", ".", "hexdigest", "(", ")", "except", ":", "output", "=", "run", "(", "[", "\"sha256sum\"", ",", "\"-b\"", ",", "filename", "]", ")", "return", "output", ".", "split", "(", "\" \"", ")", "[", "0", "]" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
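The `compute_sha256` record above has a latent bug: the file is opened in binary mode, so `buf` is `bytes`, which has no `.encode()` method; the `try` branch raises on the first chunk and the function always falls through to shelling out to `sha256sum`. A pure-hashlib sketch of chunked hashing that avoids the shell entirely:

```python
import hashlib

def sha256_of_file(path, chunk_size=0x100000):
    # Hash the file in fixed-size chunks so arbitrarily large files
    # never have to fit in memory; feed the raw bytes straight to
    # update() -- no encode() needed (or possible) on bytes.
    h = hashlib.sha256()
    with open(path, 'rb') as fd:
        while True:
            buf = fd.read(chunk_size)
            if not buf:       # read() returns b'' at EOF
                break
            h.update(buf)
    return h.hexdigest()
```

Testing `if not buf` also avoids the original's `buf in [None, ""]` check, which never matches `b''` in Python 3.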
run
Run a shell command
dgitcore/helper.py
def run(cmd): """ Run a shell command """ cmd = [pipes.quote(c) for c in cmd] cmd = " ".join(cmd) cmd += "; exit 0" # print("Running {} in {}".format(cmd, os.getcwd())) try: output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, shell=True) except subprocess.CalledProcessError as e: output = e.output output = output.decode('utf-8') output = output.strip() return output
def run(cmd): """ Run a shell command """ cmd = [pipes.quote(c) for c in cmd] cmd = " ".join(cmd) cmd += "; exit 0" # print("Running {} in {}".format(cmd, os.getcwd())) try: output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, shell=True) except subprocess.CalledProcessError as e: output = e.output output = output.decode('utf-8') output = output.strip() return output
[ "Run", "a", "shell", "command" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/helper.py#L130-L147
[ "def", "run", "(", "cmd", ")", ":", "cmd", "=", "[", "pipes", ".", "quote", "(", "c", ")", "for", "c", "in", "cmd", "]", "cmd", "=", "\" \"", ".", "join", "(", "cmd", ")", "cmd", "+=", "\"; exit 0\"", "# print(\"Running {} in {}\".format(cmd, os.getcwd()))", "try", ":", "output", "=", "subprocess", ".", "check_output", "(", "cmd", ",", "stderr", "=", "subprocess", ".", "STDOUT", ",", "shell", "=", "True", ")", "except", "subprocess", ".", "CalledProcessError", "as", "e", ":", "output", "=", "e", ".", "output", "output", "=", "output", ".", "decode", "(", "'utf-8'", ")", "output", "=", "output", ".", "strip", "(", ")", "return", "output" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
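The `run` record above quotes each argument, re-joins them, and invokes a shell with `"; exit 0"` appended to swallow the exit status. A sketch of the same output-capturing behavior without a shell, using `subprocess.run` (not the dgit implementation; just an alternative shape):

```python
import subprocess

def run(cmd):
    # Pass the argument list directly: no shell, so no quoting or
    # injection concerns. stderr is merged into stdout as in the
    # original, and check=False keeps non-zero exits from raising.
    result = subprocess.run(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        check=False,
    )
    return result.stdout.decode('utf-8').strip()
```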
log_repo_action
Log all repo actions to .dgit/log.json
dgitcore/helper.py
def log_repo_action(func): """ Log all repo actions to .dgit/log.json """ def _inner(*args, **kwargs): result = func(*args, **kwargs) log_action(func, result, *args, **kwargs) return result _inner.__name__ = func.__name__ _inner.__doc__ = func.__doc__ return _inner
def log_repo_action(func): """ Log all repo actions to .dgit/log.json """ def _inner(*args, **kwargs): result = func(*args, **kwargs) log_action(func, result, *args, **kwargs) return result _inner.__name__ = func.__name__ _inner.__doc__ = func.__doc__ return _inner
[ "Log", "all", "repo", "actions", "to", ".", "dgit", "/", "log", ".", "json" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/helper.py#L232-L244
[ "def", "log_repo_action", "(", "func", ")", ":", "def", "_inner", "(", "*", "args", ",", "*", "*", "kwargs", ")", ":", "result", "=", "func", "(", "*", "args", ",", "*", "*", "kwargs", ")", "log_action", "(", "func", ",", "result", ",", "*", "args", ",", "*", "*", "kwargs", ")", "return", "result", "_inner", ".", "__name__", "=", "func", ".", "__name__", "_inner", ".", "__doc__", "=", "func", ".", "__doc__", "return", "_inner" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
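The decorator record above copies `__name__` and `__doc__` by hand; `functools.wraps` does that plus `__module__`, `__qualname__`, and `__wrapped__` in one line. A sketch with a stubbed `log_action` (the real one lives in dgit's helper module):

```python
import functools

def log_action(func, result, *args, **kwargs):
    # Hypothetical stand-in for dgit's real log_action, which writes
    # to .dgit/log.json; here it just prints.
    print("logged:", func.__name__, "->", result)

def log_repo_action(func):
    @functools.wraps(func)  # copies __name__, __doc__, __module__, ...
    def _inner(*args, **kwargs):
        result = func(*args, **kwargs)
        log_action(func, result, *args, **kwargs)
        return result
    return _inner

@log_repo_action
def add(a, b):
    "Add two numbers."
    return a + b
```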
get_tree
Get the commit history for a given dataset
dgitcore/datasets/history.py
def get_tree(gitdir="."): """ Get the commit history for a given dataset """ cmd = ["git", "log", "--all", "--branches", '--pretty=format:{ "commit": "%H", "abbreviated_commit": "%h", "tree": "%T", "abbreviated_tree": "%t", "parent": "%P", "abbreviated_parent": "%p", "refs": "%d", "encoding": "%e", "subject": "%s", "sanitized_subject_line": "%f", "commit_notes": "", "author": { "name": "%aN", "email": "%aE", "date": "%ai" }, "commiter": { "name": "%cN", "email": "%cE", "date": "%ci" }},'] output = run(cmd) lines = output.split("\n") content = "" history = [] for l in lines: try: revisedcontent = content + l if revisedcontent.count('"') % 2 == 0: j = json.loads(revisedcontent[:-1]) if "Notes added by" in j['subject']: content = "" continue history.append(j) content = "" else: content = revisedcontent except Exception as e: print("Error while parsing record") print(revisedcontent) content = "" # Order by time. First commit first... history.reverse() # changes = get_change() for i in range(len(history)): abbrev_commit = history[i]['abbreviated_commit'] if abbrev_commit not in changes: raise Exception("Missing changes for " + abbrev_commit) history[i]['changes'] = changes[abbrev_commit]['changes'] return history
def get_tree(gitdir="."): """ Get the commit history for a given dataset """ cmd = ["git", "log", "--all", "--branches", '--pretty=format:{ "commit": "%H", "abbreviated_commit": "%h", "tree": "%T", "abbreviated_tree": "%t", "parent": "%P", "abbreviated_parent": "%p", "refs": "%d", "encoding": "%e", "subject": "%s", "sanitized_subject_line": "%f", "commit_notes": "", "author": { "name": "%aN", "email": "%aE", "date": "%ai" }, "commiter": { "name": "%cN", "email": "%cE", "date": "%ci" }},'] output = run(cmd) lines = output.split("\n") content = "" history = [] for l in lines: try: revisedcontent = content + l if revisedcontent.count('"') % 2 == 0: j = json.loads(revisedcontent[:-1]) if "Notes added by" in j['subject']: content = "" continue history.append(j) content = "" else: content = revisedcontent except Exception as e: print("Error while parsing record") print(revisedcontent) content = "" # Order by time. First commit first... history.reverse() # changes = get_change() for i in range(len(history)): abbrev_commit = history[i]['abbreviated_commit'] if abbrev_commit not in changes: raise Exception("Missing changes for " + abbrev_commit) history[i]['changes'] = changes[abbrev_commit]['changes'] return history
[ "Get", "the", "commit", "history", "for", "a", "given", "dataset" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/datasets/history.py#L62-L105
[ "def", "get_tree", "(", "gitdir", "=", "\".\"", ")", ":", "cmd", "=", "[", "\"git\"", ",", "\"log\"", ",", "\"--all\"", ",", "\"--branches\"", ",", "'--pretty=format:{ \"commit\": \"%H\", \"abbreviated_commit\": \"%h\", \"tree\": \"%T\", \"abbreviated_tree\": \"%t\", \"parent\": \"%P\", \"abbreviated_parent\": \"%p\", \"refs\": \"%d\", \"encoding\": \"%e\", \"subject\": \"%s\", \"sanitized_subject_line\": \"%f\", \"commit_notes\": \"\", \"author\": { \"name\": \"%aN\", \"email\": \"%aE\", \"date\": \"%ai\" }, \"commiter\": { \"name\": \"%cN\", \"email\": \"%cE\", \"date\": \"%ci\" }},'", "]", "output", "=", "run", "(", "cmd", ")", "lines", "=", "output", ".", "split", "(", "\"\\n\"", ")", "content", "=", "\"\"", "history", "=", "[", "]", "for", "l", "in", "lines", ":", "try", ":", "revisedcontent", "=", "content", "+", "l", "if", "revisedcontent", ".", "count", "(", "'\"'", ")", "%", "2", "==", "0", ":", "j", "=", "json", ".", "loads", "(", "revisedcontent", "[", ":", "-", "1", "]", ")", "if", "\"Notes added by\"", "in", "j", "[", "'subject'", "]", ":", "content", "=", "\"\"", "continue", "history", ".", "append", "(", "j", ")", "content", "=", "\"\"", "else", ":", "content", "=", "revisedcontent", "except", "Exception", "as", "e", ":", "print", "(", "\"Error while parsing record\"", ")", "print", "(", "revisedcontent", ")", "content", "=", "\"\"", "# Order by time. First commit first...", "history", ".", "reverse", "(", ")", "#", "changes", "=", "get_change", "(", ")", "for", "i", "in", "range", "(", "len", "(", "history", ")", ")", ":", "abbrev_commit", "=", "history", "[", "i", "]", "[", "'abbreviated_commit'", "]", "if", "abbrev_commit", "not", "in", "changes", ":", "raise", "Exception", "(", "\"Missing changes for \"", "+", "abbrev_commit", ")", "history", "[", "i", "]", "[", "'changes'", "]", "=", "changes", "[", "abbrev_commit", "]", "[", "'changes'", "]", "return", "history" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
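The quote-counting accumulation loop in get_tree above can be exercised standalone. A minimal sketch under the same convention of one trailing comma per record; `parse_log_records` is a hypothetical helper written for illustration, not part of dgit:

```python
import json

def parse_log_records(lines):
    # Buffer lines until the double-quote count is even, then parse the
    # buffer as one JSON object, dropping the trailing comma -- the same
    # heuristic get_tree uses for subjects that span multiple lines.
    history, content = [], ""
    for l in lines:
        revised = content + l
        if revised.count('"') % 2 == 0:
            history.append(json.loads(revised[:-1]))
            content = ""
        else:
            content = revised
    return history

# A record whose subject was broken across two log lines still parses.
records = parse_log_records(['{ "subject": "fix', ' bug", "commit": "abc" },'])
print(records[0]["commit"])  # abc
```

Note that, as in the original, the newline inside the subject is lost because the buffered pieces are concatenated without a separator.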
get_diffs
Look at files and compute the diffs intelligently
dgitcore/datasets/history.py
def get_diffs(history):
    """
    Look at files and compute the diffs intelligently
    """
    # First get all possible representations
    mgr = plugins_get_mgr()
    keys = mgr.search('representation')['representation']
    representations = [mgr.get_by_key('representation', k) for k in keys]

    for i in range(len(history)):
        if i+1 > len(history) - 1:
            continue

        prev = history[i]
        curr = history[i+1]

        #print(prev['subject'], "==>", curr['subject'])
        #print(curr['changes'])
        for c in curr['changes']:
            path = c['path']

            # Skip the metadata file
            if c['path'].endswith('datapackage.json'):
                continue

            # Find a handler for this kind of file...
            handler = None
            for r in representations:
                if r.can_process(path):
                    handler = r
                    break
            if handler is None:
                continue

            # print(path, "being handled by", handler)

            v1_hex = prev['commit']
            v2_hex = curr['commit']

            temp1 = tempfile.mkdtemp(prefix="dgit-diff-")

            try:
                for h in [v1_hex, v2_hex]:
                    filename = '{}/{}/checkout.tar'.format(temp1, h)
                    try:
                        os.makedirs(os.path.dirname(filename))
                    except:
                        pass
                    extractcmd = ['git', 'archive', '-o', filename, h, path]
                    output = run(extractcmd)
                    if 'fatal' in output:
                        raise Exception("File not present in commit")
                    with cd(os.path.dirname(filename)):
                        cmd = ['tar', 'xvf', 'checkout.tar']
                        output = run(cmd)
                        if 'fatal' in output:
                            print("Cleaning up - fatal 1", temp1)
                            shutil.rmtree(temp1)
                            continue

                # Check to make sure that
                path1 = os.path.join(temp1, v1_hex, path)
                path2 = os.path.join(temp1, v2_hex, path)
                if not os.path.exists(path1) or not os.path.exists(path2):
                    # print("One of the two output files is missing")
                    shutil.rmtree(temp1)
                    continue

                #print(path1, path2)

                # Now call the handler
                diff = handler.get_diff(path1, path2)

                # print("Inserting diff", diff)
                c['diff'] = diff

            except Exception as e:
                #traceback.print_exc()
                #print("Cleaning up - Exception ", temp1)
                shutil.rmtree(temp1)
[ "Look", "at", "files", "and", "compute", "the", "diffs", "intelligently" ]
pingali/dgit
python
https://github.com/pingali/dgit/blob/ecde01f40b98f0719dbcfb54452270ed2f86686d/dgitcore/datasets/history.py#L196-L278
[ "def", "get_diffs", "(", "history", ")", ":", "# First get all possible representations", "mgr", "=", "plugins_get_mgr", "(", ")", "keys", "=", "mgr", ".", "search", "(", "'representation'", ")", "[", "'representation'", "]", "representations", "=", "[", "mgr", ".", "get_by_key", "(", "'representation'", ",", "k", ")", "for", "k", "in", "keys", "]", "for", "i", "in", "range", "(", "len", "(", "history", ")", ")", ":", "if", "i", "+", "1", ">", "len", "(", "history", ")", "-", "1", ":", "continue", "prev", "=", "history", "[", "i", "]", "curr", "=", "history", "[", "i", "+", "1", "]", "#print(prev['subject'], \"==>\", curr['subject'])", "#print(curr['changes'])", "for", "c", "in", "curr", "[", "'changes'", "]", ":", "path", "=", "c", "[", "'path'", "]", "# Skip the metadata file", "if", "c", "[", "'path'", "]", ".", "endswith", "(", "'datapackage.json'", ")", ":", "continue", "# Find a handler for this kind of file...", "handler", "=", "None", "for", "r", "in", "representations", ":", "if", "r", ".", "can_process", "(", "path", ")", ":", "handler", "=", "r", "break", "if", "handler", "is", "None", ":", "continue", "# print(path, \"being handled by\", handler)", "v1_hex", "=", "prev", "[", "'commit'", "]", "v2_hex", "=", "curr", "[", "'commit'", "]", "temp1", "=", "tempfile", ".", "mkdtemp", "(", "prefix", "=", "\"dgit-diff-\"", ")", "try", ":", "for", "h", "in", "[", "v1_hex", ",", "v2_hex", "]", ":", "filename", "=", "'{}/{}/checkout.tar'", ".", "format", "(", "temp1", ",", "h", ")", "try", ":", "os", ".", "makedirs", "(", "os", ".", "path", ".", "dirname", "(", "filename", ")", ")", "except", ":", "pass", "extractcmd", "=", "[", "'git'", ",", "'archive'", ",", "'-o'", ",", "filename", ",", "h", ",", "path", "]", "output", "=", "run", "(", "extractcmd", ")", "if", "'fatal'", "in", "output", ":", "raise", "Exception", "(", "\"File not present in commit\"", ")", "with", "cd", "(", "os", ".", "path", ".", "dirname", "(", "filename", ")", ")", ":", "cmd", "=", "[", "'tar'", ",", "'xvf'", ",", "'checkout.tar'", "]", "output", "=", "run", "(", "cmd", ")", "if", "'fatal'", "in", "output", ":", "print", "(", "\"Cleaning up - fatal 1\"", ",", "temp1", ")", "shutil", ".", "rmtree", "(", "temp1", ")", "continue", "# Check to make sure that ", "path1", "=", "os", ".", "path", ".", "join", "(", "temp1", ",", "v1_hex", ",", "path", ")", "path2", "=", "os", ".", "path", ".", "join", "(", "temp1", ",", "v2_hex", ",", "path", ")", "if", "not", "os", ".", "path", ".", "exists", "(", "path1", ")", "or", "not", "os", ".", "path", ".", "exists", "(", "path2", ")", ":", "# print(\"One of the two output files is missing\") ", "shutil", ".", "rmtree", "(", "temp1", ")", "continue", "#print(path1, path2) ", "# Now call the handler", "diff", "=", "handler", ".", "get_diff", "(", "path1", ",", "path2", ")", "# print(\"Inserting diff\", diff)", "c", "[", "'diff'", "]", "=", "diff", "except", "Exception", "as", "e", ":", "#traceback.print_exc() ", "#print(\"Cleaning up - Exception \", temp1)", "shutil", ".", "rmtree", "(", "temp1", ")" ]
ecde01f40b98f0719dbcfb54452270ed2f86686d
valid
SSHClient.chdir
Parameters ---------- new_pwd: str, Directory to change to relative: bool, default True If True then the given directory is treated as relative to the current directory
poseidon/ssh.py
def chdir(self, new_pwd, relative=True):
    """
    Parameters
    ----------
    new_pwd: str,
        Directory to change to
    relative: bool, default True
        If True then the given directory is treated as relative to the
        current directory
    """
    if new_pwd and self.pwd and relative:
        new_pwd = os.path.join(self.pwd, new_pwd)
    self.pwd = new_pwd
[ "Parameters", "----------", "new_pwd", ":", "str", "Directory", "to", "change", "to", "relative", ":", "bool", "default", "True", "If", "True", "then", "the", "given", "directory", "is", "treated", "as", "relative", "to", "the", "current", "directory" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L60-L72
[ "def", "chdir", "(", "self", ",", "new_pwd", ",", "relative", "=", "True", ")", ":", "if", "new_pwd", "and", "self", ".", "pwd", "and", "relative", ":", "new_pwd", "=", "os", ".", "path", ".", "join", "(", "self", ".", "pwd", ",", "new_pwd", ")", "self", ".", "pwd", "=", "new_pwd" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
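The path bookkeeping in SSHClient.chdir can be illustrated without an SSH connection. A minimal sketch in which `chdir_path` is a hypothetical standalone version of the same logic, not part of poseidon:

```python
import os

def chdir_path(pwd, new_pwd, relative=True):
    # Mirror SSHClient.chdir: join against the tracked working directory
    # when both paths are set and relative=True; otherwise replace it.
    if new_pwd and pwd and relative:
        new_pwd = os.path.join(pwd, new_pwd)
    return new_pwd

# A relative change appends to the tracked pwd; an absolute one replaces it.
print(chdir_path("/home/user", "project"))
print(chdir_path("/home/user", "/tmp", relative=False))
```

The separator comes from os.path.join, so the result is platform-dependent; on POSIX the first call yields /home/user/project.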
SSHClient.exec_command
Proceed with caution, if you run a command that causes a prompt and then try to read/print the stdout it's going to block forever Returns ------- (stdin, stdout, stderr)
poseidon/ssh.py
def exec_command(self, cmd):
    """
    Proceed with caution, if you run a command that causes a prompt and
    then try to read/print the stdout it's going to block forever

    Returns
    -------
    (stdin, stdout, stderr)
    """
    if self.pwd is not None:
        cmd = 'cd %s ; %s' % (self.pwd, cmd)
    if self.interactive:
        print(cmd)
    return self.con.exec_command(cmd)
[ "Proceed", "with", "caution", "if", "you", "run", "a", "command", "that", "causes", "a", "prompt", "and", "then", "try", "to", "read", "/", "print", "the", "stdout", "it", "s", "going", "to", "block", "forever" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L88-L101
[ "def", "exec_command", "(", "self", ",", "cmd", ")", ":", "if", "self", ".", "pwd", "is", "not", "None", ":", "cmd", "=", "'cd %s ; %s'", "%", "(", "self", ".", "pwd", ",", "cmd", ")", "if", "self", ".", "interactive", ":", "print", "(", "cmd", ")", "return", "self", ".", "con", ".", "exec_command", "(", "cmd", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
SSHClient.wait
Execute command and wait for it to finish. Proceed with caution because if you run a command that causes a prompt this will hang
poseidon/ssh.py
def wait(self, cmd, raise_on_error=True):
    """
    Execute command and wait for it to finish. Proceed with caution
    because if you run a command that causes a prompt this will hang
    """
    _, stdout, stderr = self.exec_command(cmd)
    stdout.channel.recv_exit_status()
    output = stdout.read()
    if self.interactive:
        print(output)
    errors = stderr.read()
    if self.interactive:
        print(errors)
    if errors and raise_on_error:
        raise ValueError(errors)
    return output
[ "Execute", "command", "and", "wait", "for", "it", "to", "finish", ".", "Proceed", "with", "caution", "because", "if", "you", "run", "a", "command", "that", "causes", "a", "prompt", "this", "will", "hang" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L103-L118
[ "def", "wait", "(", "self", ",", "cmd", ",", "raise_on_error", "=", "True", ")", ":", "_", ",", "stdout", ",", "stderr", "=", "self", ".", "exec_command", "(", "cmd", ")", "stdout", ".", "channel", ".", "recv_exit_status", "(", ")", "output", "=", "stdout", ".", "read", "(", ")", "if", "self", ".", "interactive", ":", "print", "(", "output", ")", "errors", "=", "stderr", ".", "read", "(", ")", "if", "self", ".", "interactive", ":", "print", "(", "errors", ")", "if", "errors", "and", "raise_on_error", ":", "raise", "ValueError", "(", "errors", ")", "return", "output" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
SSHClient.sudo
Enter sudo mode
poseidon/ssh.py
def sudo(self, password=None):
    """
    Enter sudo mode
    """
    if self.username == 'root':
        raise ValueError('Already root user')
    password = self.validate_password(password)
    stdin, stdout, stderr = self.exec_command('sudo su')
    stdin.write("%s\n" % password)
    stdin.flush()
    errors = stderr.read()
    if errors:
        raise ValueError(errors)
[ "Enter", "sudo", "mode" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L127-L139
[ "def", "sudo", "(", "self", ",", "password", "=", "None", ")", ":", "if", "self", ".", "username", "==", "'root'", ":", "raise", "ValueError", "(", "'Already root user'", ")", "password", "=", "self", ".", "validate_password", "(", "password", ")", "stdin", ",", "stdout", ",", "stderr", "=", "self", ".", "exec_command", "(", "'sudo su'", ")", "stdin", ".", "write", "(", "\"%s\\n\"", "%", "password", ")", "stdin", ".", "flush", "(", ")", "errors", "=", "stderr", ".", "read", "(", ")", "if", "errors", ":", "raise", "ValueError", "(", "errors", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
SSHClient.apt
Install specified packages using apt-get. -y options are automatically used. Waits for command to finish. Parameters ---------- package_names: list-like of str raise_on_error: bool, default False If True then raise ValueError if stderr is not empty debconf often gives tty error
poseidon/ssh.py
def apt(self, package_names, raise_on_error=False):
    """
    Install specified packages using apt-get. -y options are
    automatically used. Waits for command to finish.

    Parameters
    ----------
    package_names: list-like of str
    raise_on_error: bool, default False
        If True then raise ValueError if stderr is not empty
        debconf often gives tty error
    """
    if isinstance(package_names, basestring):
        package_names = [package_names]
    cmd = "apt-get install -y %s" % (' '.join(package_names))
    return self.wait(cmd, raise_on_error=raise_on_error)
[ "Install", "specified", "packages", "using", "apt", "-", "get", ".", "-", "y", "options", "are", "automatically", "used", ".", "Waits", "for", "command", "to", "finish", "." ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L156-L171
[ "def", "apt", "(", "self", ",", "package_names", ",", "raise_on_error", "=", "False", ")", ":", "if", "isinstance", "(", "package_names", ",", "basestring", ")", ":", "package_names", "=", "[", "package_names", "]", "cmd", "=", "\"apt-get install -y %s\"", "%", "(", "' '", ".", "join", "(", "package_names", ")", ")", "return", "self", ".", "wait", "(", "cmd", ",", "raise_on_error", "=", "raise_on_error", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
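The command string SSHClient.apt builds can be checked in isolation. A sketch assuming Python 3, so `str` stands in for the original's Python 2 `basestring`; `apt_command` is a hypothetical helper for illustration:

```python
def apt_command(package_names):
    # Accept one package name or a list, always pass -y, as SSHClient.apt does.
    if isinstance(package_names, str):
        package_names = [package_names]
    return "apt-get install -y %s" % (' '.join(package_names))

print(apt_command("git"))           # apt-get install -y git
print(apt_command(["git", "curl"])) # apt-get install -y git curl
```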
SSHClient.pip
Install specified python packages using pip. -U option added Waits for command to finish. Parameters ---------- package_names: list-like of str raise_on_error: bool, default True If True then raise ValueError if stderr is not empty
poseidon/ssh.py
def pip(self, package_names, raise_on_error=True):
    """
    Install specified python packages using pip. -U option added
    Waits for command to finish.

    Parameters
    ----------
    package_names: list-like of str
    raise_on_error: bool, default True
        If True then raise ValueError if stderr is not empty
    """
    if isinstance(package_names, basestring):
        package_names = [package_names]
    cmd = "pip install -U %s" % (' '.join(package_names))
    return self.wait(cmd, raise_on_error=raise_on_error)
[ "Install", "specified", "python", "packages", "using", "pip", ".", "-", "U", "option", "added", "Waits", "for", "command", "to", "finish", "." ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L190-L204
[ "def", "pip", "(", "self", ",", "package_names", ",", "raise_on_error", "=", "True", ")", ":", "if", "isinstance", "(", "package_names", ",", "basestring", ")", ":", "package_names", "=", "[", "package_names", "]", "cmd", "=", "\"pip install -U %s\"", "%", "(", "' '", ".", "join", "(", "package_names", ")", ")", "return", "self", ".", "wait", "(", "cmd", ",", "raise_on_error", "=", "raise_on_error", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
SSHClient.pip_r
Install all requirements contained in the given file path Waits for command to finish. Parameters ---------- requirements: str Path to requirements.txt raise_on_error: bool, default True If True then raise ValueError if stderr is not empty
poseidon/ssh.py
def pip_r(self, requirements, raise_on_error=True):
    """
    Install all requirements contained in the given file path
    Waits for command to finish.

    Parameters
    ----------
    requirements: str
        Path to requirements.txt
    raise_on_error: bool, default True
        If True then raise ValueError if stderr is not empty
    """
    cmd = "pip install -r %s" % requirements
    return self.wait(cmd, raise_on_error=raise_on_error)
[ "Install", "all", "requirements", "contained", "in", "the", "given", "file", "path", "Waits", "for", "command", "to", "finish", "." ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L213-L226
[ "def", "pip_r", "(", "self", ",", "requirements", ",", "raise_on_error", "=", "True", ")", ":", "cmd", "=", "\"pip install -r %s\"", "%", "requirements", "return", "self", ".", "wait", "(", "cmd", ",", "raise_on_error", "=", "raise_on_error", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
SSHClient.git
Parameters ---------- token: str, default None Assumes you have GITHUB_TOKEN in envvar if None https://github.com/blog/1270-easier-builds-and-deployments-using-git- over-https-and-oauth
poseidon/ssh.py
def git(self, username, repo, alias=None, token=None):
    """
    Parameters
    ----------
    token: str, default None
        Assumes you have GITHUB_TOKEN in envvar if None

    https://github.com/blog/1270-easier-builds-and-deployments-using-git-
    over-https-and-oauth
    """
    if alias is None:
        alias = repo
    if token is None:
        token = os.environ.get('GITHUB_TOKEN')
    self.wait('mkdir -p %s' % alias)
    old_dir = self.pwd
    try:
        self.chdir(alias, relative=True)
        cmd = 'git init && git pull https://%s@github.com/%s/%s.git'
        # last line to stderr
        return self.wait(cmd % (token, username, repo),
                         raise_on_error=False)
    finally:
        self.chdir(old_dir, relative=False)
[ "Parameters", "----------", "token", ":", "str", "default", "None", "Assumes", "you", "have", "GITHUB_TOKEN", "in", "envvar", "if", "None" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/ssh.py#L262-L285
[ "def", "git", "(", "self", ",", "username", ",", "repo", ",", "alias", "=", "None", ",", "token", "=", "None", ")", ":", "if", "alias", "is", "None", ":", "alias", "=", "repo", "if", "token", "is", "None", ":", "token", "=", "os", ".", "environ", ".", "get", "(", "'GITHUB_TOKEN'", ")", "self", ".", "wait", "(", "'mkdir -p %s'", "%", "alias", ")", "old_dir", "=", "self", ".", "pwd", "try", ":", "self", ".", "chdir", "(", "alias", ",", "relative", "=", "True", ")", "cmd", "=", "'git init && git pull https://%s@github.com/%s/%s.git'", "# last line to stderr", "return", "self", ".", "wait", "(", "cmd", "%", "(", "token", ",", "username", ",", "repo", ")", ",", "raise_on_error", "=", "False", ")", "finally", ":", "self", ".", "chdir", "(", "old_dir", ",", "relative", "=", "False", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
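The authenticated pull command SSHClient.git interpolates can be reproduced on its own. A sketch in which `git_pull_command` is a hypothetical helper mirroring the command construction, with the same GITHUB_TOKEN environment-variable fallback as the original:

```python
import os

def git_pull_command(username, repo, token=None):
    # Fall back to the GITHUB_TOKEN environment variable, then build the
    # token-over-HTTPS pull command run inside the repo directory.
    if token is None:
        token = os.environ.get('GITHUB_TOKEN', '')
    cmd = 'git init && git pull https://%s@github.com/%s/%s.git'
    return cmd % (token, username, repo)

print(git_pull_command("octocat", "hello-world", token="TOKEN"))
```

Since the token is embedded in the URL, the resulting string should never be logged verbatim in real use.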
stitch_macro
Create fiji-macros for stitching all channels and z-stacks for a well. Parameters ---------- path : string Well path. output_folder : string Folder to store images. If not given well path is used. Returns ------- output_files, macros : tuple Tuple with filenames and macros for stitched well.
leicaexperiment/experiment.py
def stitch_macro(path, output_folder=None):
    """Create fiji-macros for stitching all channels and z-stacks for a well.

    Parameters
    ----------
    path : string
        Well path.
    output_folder : string
        Folder to store images. If not given well path is used.

    Returns
    -------
    output_files, macros : tuple
        Tuple with filenames and macros for stitched well.
    """
    output_folder = output_folder or path
    debug('stitching ' + path + ' to ' + output_folder)

    fields = glob(_pattern(path, _field))

    # assume we have rectangle of fields
    xs = [attribute(field, 'X') for field in fields]
    ys = [attribute(field, 'Y') for field in fields]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    fields_column = len(set(xs))
    fields_row = len(set(ys))

    # assume all fields are the same
    # and get properties from images in first field
    images = glob(_pattern(fields[0], _image))
    # assume attributes are the same on all images
    attr = attributes(images[0])

    # find all channels and z-stacks
    channels = []
    z_stacks = []
    for image in images:
        channel = attribute_as_str(image, 'C')
        if channel not in channels:
            channels.append(channel)
        z = attribute_as_str(image, 'Z')
        if z not in z_stacks:
            z_stacks.append(z)
    debug('channels ' + str(channels))
    debug('z-stacks ' + str(z_stacks))

    # create macro
    _, extension = os.path.splitext(images[-1])
    if extension == '.tif':
        # assume .ome.tif
        extension = '.ome.tif'
    macros = []
    output_files = []
    for Z in z_stacks:
        for C in channels:
            filenames = os.path.join(
                _field + '--X{xx}--Y{yy}',
                _image + '--L' + attr.L + '--S' + attr.S + '--U' + attr.U +
                '--V' + attr.V + '--J' + attr.J + '--E' + attr.E + '--O' +
                attr.O + '--X{xx}--Y{yy}' + '--T' + attr.T + '--Z' + Z +
                '--C' + C + extension)
            debug('filenames ' + filenames)

            cur_attr = attributes(filenames)._asdict()
            f = 'stitched--U{U}--V{V}--C{C}--Z{Z}.png'.format(**cur_attr)

            output = os.path.join(output_folder, f)
            debug('output ' + output)
            output_files.append(output)
            if os.path.isfile(output):
                # file already exists
                print('leicaexperiment stitched file already'
                      ' exists {}'.format(output))
                continue
            macros.append(fijibin.macro.stitch(path, filenames,
                                               fields_column, fields_row,
                                               output_filename=output,
                                               x_start=x_min,
                                               y_start=y_min))

    return (output_files, macros)
[ "Create", "fiji", "-", "macros", "for", "stitching", "all", "channels", "and", "z", "-", "stacks", "for", "a", "well", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L374-L466
[ "def", "stitch_macro", "(", "path", ",", "output_folder", "=", "None", ")", ":", "output_folder", "=", "output_folder", "or", "path", "debug", "(", "'stitching '", "+", "path", "+", "' to '", "+", "output_folder", ")", "fields", "=", "glob", "(", "_pattern", "(", "path", ",", "_field", ")", ")", "# assume we have rectangle of fields", "xs", "=", "[", "attribute", "(", "field", ",", "'X'", ")", "for", "field", "in", "fields", "]", "ys", "=", "[", "attribute", "(", "field", ",", "'Y'", ")", "for", "field", "in", "fields", "]", "x_min", ",", "x_max", "=", "min", "(", "xs", ")", ",", "max", "(", "xs", ")", "y_min", ",", "y_max", "=", "min", "(", "ys", ")", ",", "max", "(", "ys", ")", "fields_column", "=", "len", "(", "set", "(", "xs", ")", ")", "fields_row", "=", "len", "(", "set", "(", "ys", ")", ")", "# assume all fields are the same", "# and get properties from images in first field", "images", "=", "glob", "(", "_pattern", "(", "fields", "[", "0", "]", ",", "_image", ")", ")", "# assume attributes are the same on all images", "attr", "=", "attributes", "(", "images", "[", "0", "]", ")", "# find all channels and z-stacks", "channels", "=", "[", "]", "z_stacks", "=", "[", "]", "for", "image", "in", "images", ":", "channel", "=", "attribute_as_str", "(", "image", ",", "'C'", ")", "if", "channel", "not", "in", "channels", ":", "channels", ".", "append", "(", "channel", ")", "z", "=", "attribute_as_str", "(", "image", ",", "'Z'", ")", "if", "z", "not", "in", "z_stacks", ":", "z_stacks", ".", "append", "(", "z", ")", "debug", "(", "'channels '", "+", "str", "(", "channels", ")", ")", "debug", "(", "'z-stacks '", "+", "str", "(", "z_stacks", ")", ")", "# create macro", "_", ",", "extension", "=", "os", ".", "path", ".", "splitext", "(", "images", "[", "-", "1", "]", ")", "if", "extension", "==", "'.tif'", ":", "# assume .ome.tif", "extension", "=", "'.ome.tif'", "macros", "=", "[", "]", "output_files", "=", "[", "]", "for", "Z", "in", "z_stacks", ":", "for", "C", "in", "channels", ":", "filenames", "=", "os", ".", "path", ".", "join", "(", "_field", "+", "'--X{xx}--Y{yy}'", ",", "_image", "+", "'--L'", "+", "attr", ".", "L", "+", "'--S'", "+", "attr", ".", "S", "+", "'--U'", "+", "attr", ".", "U", "+", "'--V'", "+", "attr", ".", "V", "+", "'--J'", "+", "attr", ".", "J", "+", "'--E'", "+", "attr", ".", "E", "+", "'--O'", "+", "attr", ".", "O", "+", "'--X{xx}--Y{yy}'", "+", "'--T'", "+", "attr", ".", "T", "+", "'--Z'", "+", "Z", "+", "'--C'", "+", "C", "+", "extension", ")", "debug", "(", "'filenames '", "+", "filenames", ")", "cur_attr", "=", "attributes", "(", "filenames", ")", ".", "_asdict", "(", ")", "f", "=", "'stitched--U{U}--V{V}--C{C}--Z{Z}.png'", ".", "format", "(", "*", "*", "cur_attr", ")", "output", "=", "os", ".", "path", ".", "join", "(", "output_folder", ",", "f", ")", "debug", "(", "'output '", "+", "output", ")", "output_files", ".", "append", "(", "output", ")", "if", "os", ".", "path", ".", "isfile", "(", "output", ")", ":", "# file already exists", "print", "(", "'leicaexperiment stitched file already'", "' exists {}'", ".", "format", "(", "output", ")", ")", "continue", "macros", ".", "append", "(", "fijibin", ".", "macro", ".", "stitch", "(", "path", ",", "filenames", ",", "fields_column", ",", "fields_row", ",", "output_filename", "=", "output", ",", "x_start", "=", "x_min", ",", "y_start", "=", "y_min", ")", ")", "return", "(", "output_files", ",", "macros", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
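The `stitch_macro` record above derives the montage grid from the `--X`/`--Y` attributes of the field folders: columns = number of distinct X values, rows = number of distinct Y values, with `x_min`/`y_min` as the stitch origin. A minimal stdlib sketch of that step (the helper names and sample paths below are illustrative, not part of the library):

```python
import re

def attr(path, name):
    """Return the last two-digit value after --<NAME> in path, as int."""
    matches = re.findall('--' + name.upper() + r'([0-9]{2})', path)
    return int(matches[-1]) if matches else None

def grid_shape(fields):
    """Columns and rows of the field grid, plus the stitch origin (x_min, y_min)."""
    xs = [attr(f, 'X') for f in fields]
    ys = [attr(f, 'Y') for f in fields]
    return len(set(xs)), len(set(ys)), min(xs), min(ys)

# a 2x2 rectangle of fields, as assumed by stitch_macro
fields = [
    'slide--S00/chamber--U00--V00/field--X00--Y00',
    'slide--S00/chamber--U00--V00/field--X01--Y00',
    'slide--S00/chamber--U00--V00/field--X00--Y01',
    'slide--S00/chamber--U00--V00/field--X01--Y01',
]
cols, rows, x_min, y_min = grid_shape(fields)
```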
valid
compress
Lossless compression. Save images as PNG and TIFF tags to json. Can be reversed with `decompress`. Will run in multiprocessing, where number of workers is decided by ``leicaexperiment.experiment._pools``. Parameters ---------- images : list of filenames Images to losslessly compress. delete_tif : bool Whether to delete original images. folder : string Where to store images. Basename will be kept. Returns ------- list of filenames List of compressed files.
leicaexperiment/experiment.py
def compress(images, delete_tif=False, folder=None): """Lossless compression. Save images as PNG and TIFF tags to json. Can be reversed with `decompress`. Will run in multiprocessing, where number of workers is decided by ``leicaexperiment.experiment._pools``. Parameters ---------- images : list of filenames Images to lossless compress. delete_tif : bool Wheter to delete original images. folder : string Where to store images. Basename will be kept. Returns ------- list of filenames List of compressed files. """ if type(images) == str: # only one image return [compress_blocking(images, delete_tif, folder)] filenames = copy(images) # as images property will change when looping return Parallel(n_jobs=_pools)(delayed(compress_blocking) (image=image, delete_tif=delete_tif, folder=folder) for image in filenames)
def compress(images, delete_tif=False, folder=None): """Lossless compression. Save images as PNG and TIFF tags to json. Can be reversed with `decompress`. Will run in multiprocessing, where number of workers is decided by ``leicaexperiment.experiment._pools``. Parameters ---------- images : list of filenames Images to lossless compress. delete_tif : bool Wheter to delete original images. folder : string Where to store images. Basename will be kept. Returns ------- list of filenames List of compressed files. """ if type(images) == str: # only one image return [compress_blocking(images, delete_tif, folder)] filenames = copy(images) # as images property will change when looping return Parallel(n_jobs=_pools)(delayed(compress_blocking) (image=image, delete_tif=delete_tif, folder=folder) for image in filenames)
[ "Lossless", "compression", ".", "Save", "images", "as", "PNG", "and", "TIFF", "tags", "to", "json", ".", "Can", "be", "reversed", "with", "decompress", ".", "Will", "run", "in", "multiprocessing", "where", "number", "of", "workers", "is", "decided", "by", "leicaexperiment", ".", "experiment", ".", "_pools", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L469-L497
[ "def", "compress", "(", "images", ",", "delete_tif", "=", "False", ",", "folder", "=", "None", ")", ":", "if", "type", "(", "images", ")", "==", "str", ":", "# only one image", "return", "[", "compress_blocking", "(", "images", ",", "delete_tif", ",", "folder", ")", "]", "filenames", "=", "copy", "(", "images", ")", "# as images property will change when looping", "return", "Parallel", "(", "n_jobs", "=", "_pools", ")", "(", "delayed", "(", "compress_blocking", ")", "(", "image", "=", "image", ",", "delete_tif", "=", "delete_tif", ",", "folder", "=", "folder", ")", "for", "image", "in", "filenames", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
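`compress` normalizes a single-filename string to the list case, copies the list (it may mutate while looping), then fans the real work out over a pool — joblib's `Parallel`/`delayed` in the library. A hedged sketch of the same shape using only the standard library; `process_one` is a stand-in for `compress_blocking`, not the library function:

```python
from concurrent.futures import ThreadPoolExecutor

_pools = 4  # worker count, mirroring leicaexperiment.experiment._pools

def process_one(image, delete_tif=False, folder=None):
    # stand-in for compress_blocking: just return the would-be output name
    return image.rsplit('.', 1)[0] + '.png'

def process(images, delete_tif=False, folder=None):
    if isinstance(images, str):  # only one image was passed
        return [process_one(images, delete_tif, folder)]
    filenames = list(images)  # copy: source list may change while we loop
    with ThreadPoolExecutor(max_workers=_pools) as pool:
        return list(pool.map(
            lambda f: process_one(f, delete_tif, folder), filenames))
```

Threads (rather than joblib's default processes) are fine here because the stand-in worker does no CPU-bound work.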
valid
compress_blocking
Lossless compression. Save image as PNG and TIFF tags to json. Process can be reversed with `decompress`. Parameters ---------- image : string TIF-image which should be compressed losslessly. delete_tif : bool Whether to delete original images. force : bool Whether to compress even if .png already exists. Returns ------- string Filename of compressed image, or empty string if compress failed.
leicaexperiment/experiment.py
def compress_blocking(image, delete_tif=False, folder=None, force=False): """Lossless compression. Save image as PNG and TIFF tags to json. Process can be reversed with `decompress`. Parameters ---------- image : string TIF-image which should be compressed lossless. delete_tif : bool Wheter to delete original images. force : bool Wheter to compress even if .png already exists. Returns ------- string Filename of compressed image, or empty string if compress failed. """ debug('compressing {}'.format(image)) try: new_filename, extension = os.path.splitext(image) # remove last occurrence of .ome new_filename = new_filename.rsplit('.ome', 1)[0] # if compressed file should be put in specified folder if folder: basename = os.path.basename(new_filename) new_filename = os.path.join(folder, basename + '.png') else: new_filename = new_filename + '.png' # check if png exists if os.path.isfile(new_filename) and not force: compressed_images.append(new_filename) msg = "Aborting compress, PNG already" \ " exists: {}".format(new_filename) raise AssertionError(msg) if extension != '.tif': msg = "Aborting compress, not a TIFF: {}".format(image) raise AssertionError(msg) # open image, load and close file pointer img = Image.open(image) fptr = img.fp # keep file pointer, for closing img.load() # load img-data before switching mode, also closes fp # get tags and save them as json tags = img.tag.as_dict() with open(new_filename[:-4] + '.json', 'w') as f: if img.mode == 'P': # keep palette tags['palette'] = img.getpalette() json.dump(tags, f) # check if image is palette-mode if img.mode == 'P': # switch to luminance to keep data intact debug('palette-mode switched to luminance') img.mode = 'L' if img.mode == 'I;16': # https://github.com/python-pillow/Pillow/issues/1099 img = img.convert(mode='I') # compress/save debug('saving to {}'.format(new_filename)) img.save(new_filename) fptr.close() # windows bug Pillow if delete_tif: os.remove(image) except (IOError, AssertionError) as e: # print 
error - continue print('leicaexperiment {}'.format(e)) return '' return new_filename
def compress_blocking(image, delete_tif=False, folder=None, force=False): """Lossless compression. Save image as PNG and TIFF tags to json. Process can be reversed with `decompress`. Parameters ---------- image : string TIF-image which should be compressed lossless. delete_tif : bool Wheter to delete original images. force : bool Wheter to compress even if .png already exists. Returns ------- string Filename of compressed image, or empty string if compress failed. """ debug('compressing {}'.format(image)) try: new_filename, extension = os.path.splitext(image) # remove last occurrence of .ome new_filename = new_filename.rsplit('.ome', 1)[0] # if compressed file should be put in specified folder if folder: basename = os.path.basename(new_filename) new_filename = os.path.join(folder, basename + '.png') else: new_filename = new_filename + '.png' # check if png exists if os.path.isfile(new_filename) and not force: compressed_images.append(new_filename) msg = "Aborting compress, PNG already" \ " exists: {}".format(new_filename) raise AssertionError(msg) if extension != '.tif': msg = "Aborting compress, not a TIFF: {}".format(image) raise AssertionError(msg) # open image, load and close file pointer img = Image.open(image) fptr = img.fp # keep file pointer, for closing img.load() # load img-data before switching mode, also closes fp # get tags and save them as json tags = img.tag.as_dict() with open(new_filename[:-4] + '.json', 'w') as f: if img.mode == 'P': # keep palette tags['palette'] = img.getpalette() json.dump(tags, f) # check if image is palette-mode if img.mode == 'P': # switch to luminance to keep data intact debug('palette-mode switched to luminance') img.mode = 'L' if img.mode == 'I;16': # https://github.com/python-pillow/Pillow/issues/1099 img = img.convert(mode='I') # compress/save debug('saving to {}'.format(new_filename)) img.save(new_filename) fptr.close() # windows bug Pillow if delete_tif: os.remove(image) except (IOError, AssertionError) as e: # print 
error - continue print('leicaexperiment {}'.format(e)) return '' return new_filename
[ "Lossless", "compression", ".", "Save", "image", "as", "PNG", "and", "TIFF", "tags", "to", "json", ".", "Process", "can", "be", "reversed", "with", "decompress", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L500-L577
[ "def", "compress_blocking", "(", "image", ",", "delete_tif", "=", "False", ",", "folder", "=", "None", ",", "force", "=", "False", ")", ":", "debug", "(", "'compressing {}'", ".", "format", "(", "image", ")", ")", "try", ":", "new_filename", ",", "extension", "=", "os", ".", "path", ".", "splitext", "(", "image", ")", "# remove last occurrence of .ome", "new_filename", "=", "new_filename", ".", "rsplit", "(", "'.ome'", ",", "1", ")", "[", "0", "]", "# if compressed file should be put in specified folder", "if", "folder", ":", "basename", "=", "os", ".", "path", ".", "basename", "(", "new_filename", ")", "new_filename", "=", "os", ".", "path", ".", "join", "(", "folder", ",", "basename", "+", "'.png'", ")", "else", ":", "new_filename", "=", "new_filename", "+", "'.png'", "# check if png exists", "if", "os", ".", "path", ".", "isfile", "(", "new_filename", ")", "and", "not", "force", ":", "compressed_images", ".", "append", "(", "new_filename", ")", "msg", "=", "\"Aborting compress, PNG already\"", "\" exists: {}\"", ".", "format", "(", "new_filename", ")", "raise", "AssertionError", "(", "msg", ")", "if", "extension", "!=", "'.tif'", ":", "msg", "=", "\"Aborting compress, not a TIFF: {}\"", ".", "format", "(", "image", ")", "raise", "AssertionError", "(", "msg", ")", "# open image, load and close file pointer", "img", "=", "Image", ".", "open", "(", "image", ")", "fptr", "=", "img", ".", "fp", "# keep file pointer, for closing", "img", ".", "load", "(", ")", "# load img-data before switching mode, also closes fp", "# get tags and save them as json", "tags", "=", "img", ".", "tag", ".", "as_dict", "(", ")", "with", "open", "(", "new_filename", "[", ":", "-", "4", "]", "+", "'.json'", ",", "'w'", ")", "as", "f", ":", "if", "img", ".", "mode", "==", "'P'", ":", "# keep palette", "tags", "[", "'palette'", "]", "=", "img", ".", "getpalette", "(", ")", "json", ".", "dump", "(", "tags", ",", "f", ")", "# check if image is palette-mode", "if", "img", ".", "mode", "==", 
"'P'", ":", "# switch to luminance to keep data intact", "debug", "(", "'palette-mode switched to luminance'", ")", "img", ".", "mode", "=", "'L'", "if", "img", ".", "mode", "==", "'I;16'", ":", "# https://github.com/python-pillow/Pillow/issues/1099", "img", "=", "img", ".", "convert", "(", "mode", "=", "'I'", ")", "# compress/save", "debug", "(", "'saving to {}'", ".", "format", "(", "new_filename", ")", ")", "img", ".", "save", "(", "new_filename", ")", "fptr", ".", "close", "(", ")", "# windows bug Pillow", "if", "delete_tif", ":", "os", ".", "remove", "(", "image", ")", "except", "(", "IOError", ",", "AssertionError", ")", "as", "e", ":", "# print error - continue", "print", "(", "'leicaexperiment {}'", ".", "format", "(", "e", ")", ")", "return", "''", "return", "new_filename" ]
c0393c4d51984a506f813319efb66e54c4f2a426
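The naming bookkeeping in `compress_blocking` can be isolated as a pure function: strip the extension, drop a trailing `.ome`, then either keep the directory or relocate to `folder`. The function name `png_name` below is mine, not the library's:

```python
import os

def png_name(image, folder=None):
    """Derive the PNG filename compress_blocking would write for a TIFF path."""
    stem, _ = os.path.splitext(image)
    stem = stem.rsplit('.ome', 1)[0]  # remove last occurrence of '.ome'
    if folder:  # relocate, keeping only the basename
        return os.path.join(folder, os.path.basename(stem) + '.png')
    return stem + '.png'
```

The JSON sidecar holding the TIFF tags is then this name with `.png` swapped for `.json` (`new_filename[:-4] + '.json'` in the original).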
valid
decompress
Reverse compression from tif to png and save them in original format (ome.tif). TIFF-tags are read from json-files named the same as given images. Parameters ---------- images : list of filenames Images to decompress. delete_png : bool Whether to delete PNG images. delete_json : bool Whether to delete TIFF-tags stored in json files on compress. Returns ------- list of filenames List of decompressed files.
leicaexperiment/experiment.py
def decompress(images, delete_png=False, delete_json=False, folder=None): """Reverse compression from tif to png and save them in original format (ome.tif). TIFF-tags are gotten from json-files named the same as given images. Parameters ---------- images : list of filenames Image to decompress. delete_png : bool Wheter to delete PNG images. delete_json : bool Wheter to delete TIFF-tags stored in json files on compress. Returns ------- list of filenames List of decompressed files. """ if type(images) == str: # only one image return decompress([images]) filenames = copy(images) # as images property will change when looping decompressed_images = [] for orig_filename in filenames: debug('decompressing {}'.format(orig_filename)) try: filename, extension = os.path.splitext(orig_filename) # if decompressed file should be put in specified folder if folder: basename = os.path.basename(filename) new_filename = os.path.join(folder, basename + '.ome.tif') else: new_filename = filename + '.ome.tif' # check if tif exists if os.path.isfile(new_filename): decompressed_images.append(new_filename) msg = "Aborting decompress, TIFF already exists:" \ " {}".format(orig_filename) raise AssertionError(msg) if extension != '.png': msg = "Aborting decompress, not a " \ "PNG: {}".format(orig_filename) raise AssertionError(msg) # open image, load and close file pointer img = Image.open(orig_filename) img.load() # load img-data before switching mode, also closes fp # get tags from json info = {} with open(filename + '.json', 'r') as f: tags = json.load(f) # convert dictionary to original types (lost in json conversion) for tag,val in tags.items(): if tag == 'palette': # hack hack continue if type(val) == list: val = tuple(val) if type(val[0]) == list: # list of list val = tuple(tuple(x) for x in val) info[int(tag)] = val # check for color map if 'palette' in tags: img.putpalette(tags['palette']) # save as tif debug('saving to {}'.format(new_filename)) img.save(new_filename, tiffinfo=info) 
decompressed_images.append(new_filename) if delete_png: os.remove(orig_filename) if delete_json: os.remove(filename + '.json') except (IOError, AssertionError) as e: # print error - continue print('leicaexperiment {}'.format(e)) return decompressed_images
def decompress(images, delete_png=False, delete_json=False, folder=None): """Reverse compression from tif to png and save them in original format (ome.tif). TIFF-tags are gotten from json-files named the same as given images. Parameters ---------- images : list of filenames Image to decompress. delete_png : bool Wheter to delete PNG images. delete_json : bool Wheter to delete TIFF-tags stored in json files on compress. Returns ------- list of filenames List of decompressed files. """ if type(images) == str: # only one image return decompress([images]) filenames = copy(images) # as images property will change when looping decompressed_images = [] for orig_filename in filenames: debug('decompressing {}'.format(orig_filename)) try: filename, extension = os.path.splitext(orig_filename) # if decompressed file should be put in specified folder if folder: basename = os.path.basename(filename) new_filename = os.path.join(folder, basename + '.ome.tif') else: new_filename = filename + '.ome.tif' # check if tif exists if os.path.isfile(new_filename): decompressed_images.append(new_filename) msg = "Aborting decompress, TIFF already exists:" \ " {}".format(orig_filename) raise AssertionError(msg) if extension != '.png': msg = "Aborting decompress, not a " \ "PNG: {}".format(orig_filename) raise AssertionError(msg) # open image, load and close file pointer img = Image.open(orig_filename) img.load() # load img-data before switching mode, also closes fp # get tags from json info = {} with open(filename + '.json', 'r') as f: tags = json.load(f) # convert dictionary to original types (lost in json conversion) for tag,val in tags.items(): if tag == 'palette': # hack hack continue if type(val) == list: val = tuple(val) if type(val[0]) == list: # list of list val = tuple(tuple(x) for x in val) info[int(tag)] = val # check for color map if 'palette' in tags: img.putpalette(tags['palette']) # save as tif debug('saving to {}'.format(new_filename)) img.save(new_filename, tiffinfo=info) 
decompressed_images.append(new_filename) if delete_png: os.remove(orig_filename) if delete_json: os.remove(filename + '.json') except (IOError, AssertionError) as e: # print error - continue print('leicaexperiment {}'.format(e)) return decompressed_images
[ "Reverse", "compression", "from", "tif", "to", "png", "and", "save", "them", "in", "original", "format", "(", "ome", ".", "tif", ")", ".", "TIFF", "-", "tags", "are", "gotten", "from", "json", "-", "files", "named", "the", "same", "as", "given", "images", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L581-L669
[ "def", "decompress", "(", "images", ",", "delete_png", "=", "False", ",", "delete_json", "=", "False", ",", "folder", "=", "None", ")", ":", "if", "type", "(", "images", ")", "==", "str", ":", "# only one image", "return", "decompress", "(", "[", "images", "]", ")", "filenames", "=", "copy", "(", "images", ")", "# as images property will change when looping", "decompressed_images", "=", "[", "]", "for", "orig_filename", "in", "filenames", ":", "debug", "(", "'decompressing {}'", ".", "format", "(", "orig_filename", ")", ")", "try", ":", "filename", ",", "extension", "=", "os", ".", "path", ".", "splitext", "(", "orig_filename", ")", "# if decompressed file should be put in specified folder", "if", "folder", ":", "basename", "=", "os", ".", "path", ".", "basename", "(", "filename", ")", "new_filename", "=", "os", ".", "path", ".", "join", "(", "folder", ",", "basename", "+", "'.ome.tif'", ")", "else", ":", "new_filename", "=", "filename", "+", "'.ome.tif'", "# check if tif exists", "if", "os", ".", "path", ".", "isfile", "(", "new_filename", ")", ":", "decompressed_images", ".", "append", "(", "new_filename", ")", "msg", "=", "\"Aborting decompress, TIFF already exists:\"", "\" {}\"", ".", "format", "(", "orig_filename", ")", "raise", "AssertionError", "(", "msg", ")", "if", "extension", "!=", "'.png'", ":", "msg", "=", "\"Aborting decompress, not a \"", "\"PNG: {}\"", ".", "format", "(", "orig_filename", ")", "raise", "AssertionError", "(", "msg", ")", "# open image, load and close file pointer", "img", "=", "Image", ".", "open", "(", "orig_filename", ")", "img", ".", "load", "(", ")", "# load img-data before switching mode, also closes fp", "# get tags from json", "info", "=", "{", "}", "with", "open", "(", "filename", "+", "'.json'", ",", "'r'", ")", "as", "f", ":", "tags", "=", "json", ".", "load", "(", "f", ")", "# convert dictionary to original types (lost in json conversion)", "for", "tag", ",", "val", "in", "tags", ".", "items", "(", ")", ":", "if", "tag", 
"==", "'palette'", ":", "# hack hack", "continue", "if", "type", "(", "val", ")", "==", "list", ":", "val", "=", "tuple", "(", "val", ")", "if", "type", "(", "val", "[", "0", "]", ")", "==", "list", ":", "# list of list", "val", "=", "tuple", "(", "tuple", "(", "x", ")", "for", "x", "in", "val", ")", "info", "[", "int", "(", "tag", ")", "]", "=", "val", "# check for color map", "if", "'palette'", "in", "tags", ":", "img", ".", "putpalette", "(", "tags", "[", "'palette'", "]", ")", "# save as tif", "debug", "(", "'saving to {}'", ".", "format", "(", "new_filename", ")", ")", "img", ".", "save", "(", "new_filename", ",", "tiffinfo", "=", "info", ")", "decompressed_images", ".", "append", "(", "new_filename", ")", "if", "delete_png", ":", "os", ".", "remove", "(", "orig_filename", ")", "if", "delete_json", ":", "os", ".", "remove", "(", "filename", "+", "'.json'", ")", "except", "(", "IOError", ",", "AssertionError", ")", "as", "e", ":", "# print error - continue", "print", "(", "'leicaexperiment {}'", ".", "format", "(", "e", ")", ")", "return", "decompressed_images" ]
c0393c4d51984a506f813319efb66e54c4f2a426
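Because JSON has neither tuples nor non-string keys, `decompress` has to undo both conversions before handing the tags back to Pillow: every list becomes a tuple (one level of recursion for rational values stored as list-of-lists), every key becomes an int, and the `'palette'` entry is pulled out separately. A standalone sketch of that restoration step (`restore_tags` is my name for it):

```python
import json

def restore_tags(serialized):
    """Rebuild a TIFF tag dict (int keys, tuple values) from its JSON form."""
    tags = json.loads(serialized)
    info, palette = {}, tags.pop('palette', None)  # palette handled separately
    for tag, val in tags.items():
        if isinstance(val, list):
            if val and isinstance(val[0], list):  # list of lists -> tuple of tuples
                val = tuple(tuple(x) for x in val)
            else:
                val = tuple(val)
        info[int(tag)] = val
    return info, palette

# e.g. tag 256 = ImageWidth, 282 = XResolution (a rational)
serialized = json.dumps({'256': [1024], '282': [[300, 1]], 'palette': [0, 0, 0]})
info, palette = restore_tags(serialized)
```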
valid
attribute
Returns the two-digit number found behind --[A-Z] in path. If several matches are found, the last one is returned. Parameters ---------- path : string String with path of file/folder to get attribute from. name : string Name of attribute to get. Should be A-Z or a-z (implicitly converted to uppercase). Returns ------- integer Returns number found in path behind --name as an integer.
leicaexperiment/experiment.py
def attribute(path, name): """Returns the two numbers found behind --[A-Z] in path. If several matches are found, the last one is returned. Parameters ---------- path : string String with path of file/folder to get attribute from. name : string Name of attribute to get. Should be A-Z or a-z (implicit converted to uppercase). Returns ------- integer Returns number found in path behind --name as an integer. """ matches = re.findall('--' + name.upper() + '([0-9]{2})', path) if matches: return int(matches[-1]) else: return None
def attribute(path, name): """Returns the two numbers found behind --[A-Z] in path. If several matches are found, the last one is returned. Parameters ---------- path : string String with path of file/folder to get attribute from. name : string Name of attribute to get. Should be A-Z or a-z (implicit converted to uppercase). Returns ------- integer Returns number found in path behind --name as an integer. """ matches = re.findall('--' + name.upper() + '([0-9]{2})', path) if matches: return int(matches[-1]) else: return None
[ "Returns", "the", "two", "numbers", "found", "behind", "--", "[", "A", "-", "Z", "]", "in", "path", ".", "If", "several", "matches", "are", "found", "the", "last", "one", "is", "returned", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L672-L693
[ "def", "attribute", "(", "path", ",", "name", ")", ":", "matches", "=", "re", ".", "findall", "(", "'--'", "+", "name", ".", "upper", "(", ")", "+", "'([0-9]{2})'", ",", "path", ")", "if", "matches", ":", "return", "int", "(", "matches", "[", "-", "1", "]", ")", "else", ":", "return", "None" ]
c0393c4d51984a506f813319efb66e54c4f2a426
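`attribute` (and its string twin `attribute_as_str`, below) is a thin regex lookup: find every two-digit run following `--<LETTER>` and keep the last match. A self-contained illustration combining both variants; the sample path is invented for the demo:

```python
import re

def attribute(path, name, as_str=False):
    """Last two-digit value after --NAME in path; None when absent."""
    matches = re.findall('--' + name.upper() + r'([0-9]{2})', path)
    if not matches:
        return None
    return matches[-1] if as_str else int(matches[-1])

path = 'chamber--U00--V03/field--X01--Y02/image--X01--Y02--C04.ome.tif'
```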
valid
attribute_as_str
Returns the two-digit number found behind --[A-Z] in path. If several matches are found, the last one is returned. Parameters ---------- path : string String with path of file/folder to get attribute from. name : string Name of attribute to get. Should be A-Z or a-z (implicitly converted to uppercase). Returns ------- string Returns two-digit number found in path behind --name.
leicaexperiment/experiment.py
def attribute_as_str(path, name): """Returns the two numbers found behind --[A-Z] in path. If several matches are found, the last one is returned. Parameters ---------- path : string String with path of file/folder to get attribute from. name : string Name of attribute to get. Should be A-Z or a-z (implicit converted to uppercase). Returns ------- string Returns two digit number found in path behind --name. """ matches = re.findall('--' + name.upper() + '([0-9]{2})', path) if matches: return matches[-1] else: return None
def attribute_as_str(path, name): """Returns the two numbers found behind --[A-Z] in path. If several matches are found, the last one is returned. Parameters ---------- path : string String with path of file/folder to get attribute from. name : string Name of attribute to get. Should be A-Z or a-z (implicit converted to uppercase). Returns ------- string Returns two digit number found in path behind --name. """ matches = re.findall('--' + name.upper() + '([0-9]{2})', path) if matches: return matches[-1] else: return None
[ "Returns", "the", "two", "numbers", "found", "behind", "--", "[", "A", "-", "Z", "]", "in", "path", ".", "If", "several", "matches", "are", "found", "the", "last", "one", "is", "returned", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L696-L717
[ "def", "attribute_as_str", "(", "path", ",", "name", ")", ":", "matches", "=", "re", ".", "findall", "(", "'--'", "+", "name", ".", "upper", "(", ")", "+", "'([0-9]{2})'", ",", "path", ")", "if", "matches", ":", "return", "matches", "[", "-", "1", "]", "else", ":", "return", "None" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
attributes
Get attributes from path based on format --[A-Z]. Returns namedtuple with upper-case attributes equal to what is found in path (string) and lower-case as int. If path holds several occurrences of the same character, only the last one is kept. >>> attrs = attributes('/folder/file--X00--X01.tif') >>> print(attrs) namedtuple('attributes', 'X x')('01', 1) >>> print(attrs.x) 1 Parameters ---------- path : string Returns ------- collections.namedtuple
leicaexperiment/experiment.py
def attributes(path): """Get attributes from path based on format --[A-Z]. Returns namedtuple with upper case attributes equal to what found in path (string) and lower case as int. If path holds several occurrences of same character, only the last one is kept. >>> attrs = attributes('/folder/file--X00-X01.tif') >>> print(attrs) namedtuple('attributes', 'X x')('01', 1) >>> print(attrs.x) 1 Parameters ---------- path : string Returns ------- collections.namedtuple """ # number of charcters set to numbers have changed in LAS AF X !! matches = re.findall('--([A-Z]{1})([0-9]{2,4})', path) keys = [] values = [] for k,v in matches: if k in keys: # keep only last key i = keys.index(k) del keys[i] del values[i] keys.append(k) values.append(v) lower_keys = [k.lower() for k in keys] int_values= [int(v) for v in values] attributes = namedtuple('attributes', keys + lower_keys) return attributes(*values + int_values)
def attributes(path): """Get attributes from path based on format --[A-Z]. Returns namedtuple with upper case attributes equal to what found in path (string) and lower case as int. If path holds several occurrences of same character, only the last one is kept. >>> attrs = attributes('/folder/file--X00-X01.tif') >>> print(attrs) namedtuple('attributes', 'X x')('01', 1) >>> print(attrs.x) 1 Parameters ---------- path : string Returns ------- collections.namedtuple """ # number of charcters set to numbers have changed in LAS AF X !! matches = re.findall('--([A-Z]{1})([0-9]{2,4})', path) keys = [] values = [] for k,v in matches: if k in keys: # keep only last key i = keys.index(k) del keys[i] del values[i] keys.append(k) values.append(v) lower_keys = [k.lower() for k in keys] int_values= [int(v) for v in values] attributes = namedtuple('attributes', keys + lower_keys) return attributes(*values + int_values)
[ "Get", "attributes", "from", "path", "based", "on", "format", "--", "[", "A", "-", "Z", "]", ".", "Returns", "namedtuple", "with", "upper", "case", "attributes", "equal", "to", "what", "found", "in", "path", "(", "string", ")", "and", "lower", "case", "as", "int", ".", "If", "path", "holds", "several", "occurrences", "of", "same", "character", "only", "the", "last", "one", "is", "kept", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L719-L758
[ "def", "attributes", "(", "path", ")", ":", "# number of charcters set to numbers have changed in LAS AF X !!", "matches", "=", "re", ".", "findall", "(", "'--([A-Z]{1})([0-9]{2,4})'", ",", "path", ")", "keys", "=", "[", "]", "values", "=", "[", "]", "for", "k", ",", "v", "in", "matches", ":", "if", "k", "in", "keys", ":", "# keep only last key", "i", "=", "keys", ".", "index", "(", "k", ")", "del", "keys", "[", "i", "]", "del", "values", "[", "i", "]", "keys", ".", "append", "(", "k", ")", "values", ".", "append", "(", "v", ")", "lower_keys", "=", "[", "k", ".", "lower", "(", ")", "for", "k", "in", "keys", "]", "int_values", "=", "[", "int", "(", "v", ")", "for", "v", "in", "values", "]", "attributes", "=", "namedtuple", "(", "'attributes'", ",", "keys", "+", "lower_keys", ")", "return", "attributes", "(", "*", "values", "+", "int_values", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
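`attributes` generalizes the single-attribute lookup to every `--<letter><digits>` pair in a path, keeping only the last occurrence of each letter and exposing each value twice in a namedtuple: the upper-case field holds the raw string, the lower-case field the int. A sketch of the dedup-and-build step, using a dict's insertion order instead of the original's parallel key/value lists (note the example path uses a double dash before each attribute, which the regex requires):

```python
import re
from collections import namedtuple

def attributes(path):
    # 2-4 digits: the field width changed between LAS AF versions
    matches = re.findall('--([A-Z])([0-9]{2,4})', path)
    last = {}  # insertion-ordered; pop+reinsert moves a repeated key last
    for key, value in matches:
        last.pop(key, None)
        last[key] = value
    keys, values = list(last), list(last.values())
    fields = keys + [k.lower() for k in keys]
    return namedtuple('attributes', fields)(*values + [int(v) for v in values])

attrs = attributes('/folder/file--X00--X01.tif')
```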
valid
_pattern
Returns globbing pattern for name1/name2/../lastname + '--*' or name1/name2/../lastname + extension if parameter `extension` is set. Parameters ---------- names : strings Which path to join. Example: _pattern('path', 'to', 'experiment') will return `path/to/experiment--*`. extension : string If another extension than --* is wanted. Example: _pattern('path', 'to', 'image', extension='*.png') will return `path/to/image*.png`. Returns ------- string Joined glob pattern string.
leicaexperiment/experiment.py
def _pattern(*names, **kwargs): """Returns globbing pattern for name1/name2/../lastname + '--*' or name1/name2/../lastname + extension if parameter `extension` it set. Parameters ---------- names : strings Which path to join. Example: _pattern('path', 'to', 'experiment') will return `path/to/experiment--*`. extension : string If other extension then --* is wanted. Example: _pattern('path', 'to', 'image', extension='*.png') will return `path/to/image*.png`. Returns ------- string Joined glob pattern string. """ if 'extension' not in kwargs: kwargs['extension'] = '--*' return os.path.join(*names) + kwargs['extension']
def _pattern(*names, **kwargs): """Returns globbing pattern for name1/name2/../lastname + '--*' or name1/name2/../lastname + extension if parameter `extension` it set. Parameters ---------- names : strings Which path to join. Example: _pattern('path', 'to', 'experiment') will return `path/to/experiment--*`. extension : string If other extension then --* is wanted. Example: _pattern('path', 'to', 'image', extension='*.png') will return `path/to/image*.png`. Returns ------- string Joined glob pattern string. """ if 'extension' not in kwargs: kwargs['extension'] = '--*' return os.path.join(*names) + kwargs['extension']
[ "Returns", "globbing", "pattern", "for", "name1", "/", "name2", "/", "..", "/", "lastname", "+", "--", "*", "or", "name1", "/", "name2", "/", "..", "/", "lastname", "+", "extension", "if", "parameter", "extension", "it", "set", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L763-L784
[ "def", "_pattern", "(", "*", "names", ",", "*", "*", "kwargs", ")", ":", "if", "'extension'", "not", "in", "kwargs", ":", "kwargs", "[", "'extension'", "]", "=", "'--*'", "return", "os", ".", "path", ".", "join", "(", "*", "names", ")", "+", "kwargs", "[", "'extension'", "]" ]
c0393c4d51984a506f813319efb66e54c4f2a426
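`_pattern` is just path joining plus a default `--*` glob suffix; the `**kwargs` dance in the original keeps Python 2 compatibility. With a Python 3 keyword-only argument it reduces to (a sketch, not the library's code):

```python
import os

def pattern(*names, extension='--*'):
    """Glob pattern for the joined path, e.g. pattern('a', 'b') -> 'a/b--*'."""
    return os.path.join(*names) + extension
```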
valid
_set_path
Set self.path, self.dirname and self.basename.
leicaexperiment/experiment.py
def _set_path(self, path): "Set self.path, self.dirname and self.basename." import os.path self.path = os.path.abspath(path) self.dirname = os.path.dirname(path) self.basename = os.path.basename(path)
def _set_path(self, path): "Set self.path, self.dirname and self.basename." import os.path self.path = os.path.abspath(path) self.dirname = os.path.dirname(path) self.basename = os.path.basename(path)
[ "Set", "self", ".", "path", "self", ".", "dirname", "and", "self", ".", "basename", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L787-L792
[ "def", "_set_path", "(", "self", ",", "path", ")", ":", "import", "os", ".", "path", "self", ".", "path", "=", "os", ".", "path", ".", "abspath", "(", "path", ")", "self", ".", "dirname", "=", "os", ".", "path", ".", "dirname", "(", "path", ")", "self", ".", "basename", "=", "os", ".", "path", ".", "basename", "(", "path", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.images
List of paths to images.
leicaexperiment/experiment.py
def images(self): "List of paths to images." tifs = _pattern(self._image_path, extension='tif') pngs = _pattern(self._image_path, extension='png') imgs = [] imgs.extend(glob(tifs)) imgs.extend(glob(pngs)) return imgs
def images(self): "List of paths to images." tifs = _pattern(self._image_path, extension='tif') pngs = _pattern(self._image_path, extension='png') imgs = [] imgs.extend(glob(tifs)) imgs.extend(glob(pngs)) return imgs
[ "List", "of", "paths", "to", "images", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L96-L103
[ "def", "images", "(", "self", ")", ":", "tifs", "=", "_pattern", "(", "self", ".", "_image_path", ",", "extension", "=", "'tif'", ")", "pngs", "=", "_pattern", "(", "self", ".", "_image_path", ",", "extension", "=", "'png'", ")", "imgs", "=", "[", "]", "imgs", ".", "extend", "(", "glob", "(", "tifs", ")", ")", "imgs", ".", "extend", "(", "glob", "(", "pngs", ")", ")", "return", "imgs" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.scanning_template
Path to {ScanningTemplate}name.xml of experiment.
leicaexperiment/experiment.py
def scanning_template(self):
    "Path to {ScanningTemplate}name.xml of experiment."
    tmpl = glob(_pattern(self.path, _additional_data,
                         _scanning_template, extension='*.xml'))
    if tmpl:
        return tmpl[0]
    else:
        return ''
[ "Path", "to", "{", "ScanningTemplate", "}", "name", ".", "xml", "of", "experiment", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L113-L120
[ "def", "scanning_template", "(", "self", ")", ":", "tmpl", "=", "glob", "(", "_pattern", "(", "self", ".", "path", ",", "_additional_data", ",", "_scanning_template", ",", "extension", "=", "'*.xml'", ")", ")", "if", "tmpl", ":", "return", "tmpl", "[", "0", "]", "else", ":", "return", "''" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
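`scanning_template` relies on the module's `_pattern` helper (visible in the token list of an earlier record), which joins path components and appends an extension glob, defaulting to `'--*'`. A self-contained sketch of that pattern building; the file name used here is hypothetical, invented only to give the glob something to match:

```python
import glob
import os
import tempfile

def pattern(*names, extension='--*'):
    # Same idea as leicaexperiment's _pattern: join components,
    # then append a glob suffix for the extension.
    return os.path.join(*names) + extension

with tempfile.TemporaryDirectory() as root:
    sub = os.path.join(root, 'AdditionalData')
    os.mkdir(sub)
    # Hypothetical scanning-template file, for illustration only.
    open(os.path.join(sub, 'ScanningTemplatetest.xml'), 'w').close()

    tmpl = glob.glob(pattern(root, 'AdditionalData', 'ScanningTemplate',
                             extension='*.xml'))
    found = tmpl[0] if tmpl else ''
    print(os.path.basename(found))
```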
Experiment.well_rows
All well rows in experiment. Equivalent to --U in files.

Returns
-------
list of ints
leicaexperiment/experiment.py
def well_rows(self, well_row, well_column):
    """All well rows in experiment. Equivalent to --U in files.

    Returns
    -------
    list of ints
    """
    return list(set([attribute(img, 'u') for img in self.images]))
[ "All", "well", "rows", "in", "experiment", ".", "Equivalent", "to", "--", "U", "in", "files", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L135-L142
[ "def", "well_rows", "(", "self", ",", "well_row", ",", "well_column", ")", ":", "return", "list", "(", "set", "(", "[", "attribute", "(", "img", ",", "'u'", ")", "for", "img", "in", "self", ".", "images", "]", ")", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.image
Get path of specified image.

Parameters
----------
well_row : int
    Starts at 0. Same as --U in files.
well_column : int
    Starts at 0. Same as --V in files.
field_row : int
    Starts at 0. Same as --Y in files.
field_column : int
    Starts at 0. Same as --X in files.

Returns
-------
string
    Path to image or empty string if image is not found.
leicaexperiment/experiment.py
def image(self, well_row, well_column, field_row, field_column):
    """Get path of specified image.

    Parameters
    ----------
    well_row : int
        Starts at 0. Same as --U in files.
    well_column : int
        Starts at 0. Same as --V in files.
    field_row : int
        Starts at 0. Same as --Y in files.
    field_column : int
        Starts at 0. Same as --X in files.

    Returns
    -------
    string
        Path to image or empty string if image is not found.
    """
    return next((i for i in self.images
                 if attribute(i, 'u') == well_column and
                    attribute(i, 'v') == well_row and
                    attribute(i, 'x') == field_column and
                    attribute(i, 'y') == field_row), '')
[ "Get", "path", "of", "specified", "image", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L153-L176
[ "def", "image", "(", "self", ",", "well_row", ",", "well_column", ",", "field_row", ",", "field_column", ")", ":", "return", "next", "(", "(", "i", "for", "i", "in", "self", ".", "images", "if", "attribute", "(", "i", ",", "'u'", ")", "==", "well_column", "and", "attribute", "(", "i", ",", "'v'", ")", "==", "well_row", "and", "attribute", "(", "i", ",", "'x'", ")", "==", "field_column", "and", "attribute", "(", "i", ",", "'y'", ")", "==", "field_row", ")", ",", "''", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.well_images
Get list of paths to images in specified well.

Parameters
----------
well_row : int
    Starts at 0. Same as --V in files.
well_column : int
    Starts at 0. Same as --U in files.

Returns
-------
list of strings
    Paths to images or empty list if no images are found.
leicaexperiment/experiment.py
def well_images(self, well_row, well_column):
    """Get list of paths to images in specified well.

    Parameters
    ----------
    well_row : int
        Starts at 0. Same as --V in files.
    well_column : int
        Starts at 0. Same as --U in files.

    Returns
    -------
    list of strings
        Paths to images or empty list if no images are found.
    """
    return list(i for i in self.images
                if attribute(i, 'u') == well_column and
                   attribute(i, 'v') == well_row)
[ "Get", "list", "of", "paths", "to", "images", "in", "specified", "well", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L179-L197
[ "def", "well_images", "(", "self", ",", "well_row", ",", "well_column", ")", ":", "return", "list", "(", "i", "for", "i", "in", "self", ".", "images", "if", "attribute", "(", "i", ",", "'u'", ")", "==", "well_column", "and", "attribute", "(", "i", ",", "'v'", ")", "==", "well_row", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
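Several of the methods above (`image`, `well_images`, `well_rows`) filter `self.images` by comparing per-file attributes extracted with the repo's `attribute` helper, whose implementation is not shown in this chunk. A sketch of the same filtering, assuming the Leica `--U..--V..` file-name convention and using a regex-based stand-in for `attribute` (an assumption, not the library's actual code):

```python
import re

def attribute(path, name):
    # Stand-in for leicaexperiment's attribute(): pull the integer that
    # follows '--<LETTER>' out of a file name, e.g. 0 from '--U00'.
    m = re.search(r'--{}(\d+)'.format(name.upper()), path)
    return int(m.group(1)) if m else None

images = [
    'image--U00--V00--X00--Y00.ome.tif',
    'image--U00--V01--X00--Y00.ome.tif',
    'image--U01--V00--X00--Y00.ome.tif',
]

# Same filter as well_images(well_row=0, well_column=0):
# u encodes the well column, v the well row.
well = [i for i in images
        if attribute(i, 'u') == 0 and attribute(i, 'v') == 0]
print(well)  # only the --U00--V00 image remains
```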
Experiment.field_columns
Field columns for given well. Equivalent to --X in files.

Parameters
----------
well_row : int
    Starts at 0. Same as --V in files.
well_column : int
    Starts at 0. Same as --U in files.

Returns
-------
list of ints
    Columns found for specified well.
leicaexperiment/experiment.py
def field_columns(self, well_row, well_column):
    """Field columns for given well. Equivalent to --X in files.

    Parameters
    ----------
    well_row : int
        Starts at 0. Same as --V in files.
    well_column : int
        Starts at 0. Same as --U in files.

    Returns
    -------
    list of ints
        Columns found for specified well.
    """
    imgs = self.well_images(well_row, well_column)
    return list(set([attribute(img, 'x') for img in imgs]))
[ "Field", "columns", "for", "given", "well", ".", "Equivalent", "to", "--", "X", "in", "files", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L200-L216
[ "def", "field_columns", "(", "self", ",", "well_row", ",", "well_column", ")", ":", "imgs", "=", "self", ".", "well_images", "(", "well_row", ",", "well_column", ")", "return", "list", "(", "set", "(", "[", "attribute", "(", "img", ",", "'x'", ")", "for", "img", "in", "imgs", "]", ")", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.stitch
Stitches all wells in experiment with ImageJ. Stitched images are saved in
experiment root. Images which already exists are omitted stitching.

Parameters
----------
folder : string
    Where to store stitched images. Defaults to experiment path.

Returns
-------
list
    Filenames of stitched images. Files which already exists before
    stitching are also returned.
leicaexperiment/experiment.py
def stitch(self, folder=None):
    """Stitches all wells in experiment with ImageJ. Stitched images are
    saved in experiment root. Images which already exists are omitted
    stitching.

    Parameters
    ----------
    folder : string
        Where to store stitched images. Defaults to experiment path.

    Returns
    -------
    list
        Filenames of stitched images. Files which already exists before
        stitching are also returned.
    """
    debug('stitching ' + self.__str__())
    if not folder:
        folder = self.path

    # create list of macros and files
    macros = []
    files = []
    for well in self.wells:
        f, m = stitch_macro(well, folder)
        macros.extend(m)
        files.extend(f)

    chopped_arguments = zip(chop(macros, _pools), chop(files, _pools))
    chopped_filenames = Parallel(n_jobs=_pools)(
        delayed(fijibin.macro.run)(macro=arg[0], output_files=arg[1])
        for arg in chopped_arguments)

    # flatten
    return [f for list_ in chopped_filenames for f in list_]
[ "Stitches", "all", "wells", "in", "experiment", "with", "ImageJ", ".", "Stitched", "images", "are", "saved", "in", "experiment", "root", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L238-L273
[ "def", "stitch", "(", "self", ",", "folder", "=", "None", ")", ":", "debug", "(", "'stitching '", "+", "self", ".", "__str__", "(", ")", ")", "if", "not", "folder", ":", "folder", "=", "self", ".", "path", "# create list of macros and files", "macros", "=", "[", "]", "files", "=", "[", "]", "for", "well", "in", "self", ".", "wells", ":", "f", ",", "m", "=", "stitch_macro", "(", "well", ",", "folder", ")", "macros", ".", "extend", "(", "m", ")", "files", ".", "extend", "(", "f", ")", "chopped_arguments", "=", "zip", "(", "chop", "(", "macros", ",", "_pools", ")", ",", "chop", "(", "files", ",", "_pools", ")", ")", "chopped_filenames", "=", "Parallel", "(", "n_jobs", "=", "_pools", ")", "(", "delayed", "(", "fijibin", ".", "macro", ".", "run", ")", "(", "macro", "=", "arg", "[", "0", "]", ",", "output_files", "=", "arg", "[", "1", "]", ")", "for", "arg", "in", "chopped_arguments", ")", "# flatten", "return", "[", "f", "for", "list_", "in", "chopped_filenames", "for", "f", "in", "list_", "]" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.compress
Lossless compress all images in experiment to PNG. If folder is omitted,
images will not be moved.

Images which already exists in PNG are omitted.

Parameters
----------
folder : string
    Where to store PNGs. Defaults to the folder they are in.
delete_tif : bool
    If set to truthy value, ome.tifs will be deleted after compression.

Returns
-------
list
    Filenames of PNG images. Files which already exists before compression
    are also returned.
leicaexperiment/experiment.py
def compress(self, delete_tif=False, folder=None):
    """Lossless compress all images in experiment to PNG. If folder is
    omitted, images will not be moved.

    Images which already exists in PNG are omitted.

    Parameters
    ----------
    folder : string
        Where to store PNGs. Defaults to the folder they are in.
    delete_tif : bool
        If set to truthy value, ome.tifs will be deleted after compression.

    Returns
    -------
    list
        Filenames of PNG images. Files which already exists before
        compression are also returned.
    """
    return compress(self.images, delete_tif, folder)
[ "Lossless", "compress", "all", "images", "in", "experiment", "to", "PNG", ".", "If", "folder", "is", "omitted", "images", "will", "not", "be", "moved", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L276-L295
[ "def", "compress", "(", "self", ",", "delete_tif", "=", "False", ",", "folder", "=", "None", ")", ":", "return", "compress", "(", "self", ".", "images", ",", "delete_tif", ",", "folder", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.field_metadata
Get OME-XML metadata of given field.

Parameters
----------
well_row : int
    Y well coordinate. Same as --V in files.
well_column : int
    X well coordinate. Same as --U in files.
field_row : int
    Y field coordinate. Same as --Y in files.
field_column : int
    X field coordinate. Same as --X in files.

Returns
-------
lxml.objectify.ObjectifiedElement
    lxml object of OME-XML found in slide/chamber/field/metadata.
leicaexperiment/experiment.py
def field_metadata(self, well_row=0, well_column=0,
                   field_row=0, field_column=0):
    """Get OME-XML metadata of given field.

    Parameters
    ----------
    well_row : int
        Y well coordinate. Same as --V in files.
    well_column : int
        X well coordinate. Same as --U in files.
    field_row : int
        Y field coordinate. Same as --Y in files.
    field_column : int
        X field coordinate. Same as --X in files.

    Returns
    -------
    lxml.objectify.ObjectifiedElement
        lxml object of OME-XML found in slide/chamber/field/metadata.
    """
    def condition(path):
        attrs = attributes(path)
        return (attrs.u == well_column and attrs.v == well_row
                and attrs.x == field_column and attrs.y == field_row)

    field = [f for f in self.fields if condition(f)]

    if field:
        field = field[0]
        filename = _pattern(field, 'metadata', _image, extension='*.ome.xml')
        filename = glob(filename)[0]  # resolve, assume found
        return objectify.parse(filename).getroot()
[ "Get", "OME", "-", "XML", "metadata", "of", "given", "field", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L298-L330
[ "def", "field_metadata", "(", "self", ",", "well_row", "=", "0", ",", "well_column", "=", "0", ",", "field_row", "=", "0", ",", "field_column", "=", "0", ")", ":", "def", "condition", "(", "path", ")", ":", "attrs", "=", "attributes", "(", "path", ")", "return", "(", "attrs", ".", "u", "==", "well_column", "and", "attrs", ".", "v", "==", "well_row", "and", "attrs", ".", "x", "==", "field_column", "and", "attrs", ".", "y", "==", "field_row", ")", "field", "=", "[", "f", "for", "f", "in", "self", ".", "fields", "if", "condition", "(", "f", ")", "]", "if", "field", ":", "field", "=", "field", "[", "0", "]", "filename", "=", "_pattern", "(", "field", ",", "'metadata'", ",", "_image", ",", "extension", "=", "'*.ome.xml'", ")", "filename", "=", "glob", "(", "filename", ")", "[", "0", "]", "# resolve, assume found", "return", "objectify", ".", "parse", "(", "filename", ")", ".", "getroot", "(", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
Experiment.stitch_coordinates
Get a list of stitch coordinates for the given well.

Parameters
----------
well_row : int
    Y well coordinate. Same as --V in files.
well_column : int
    X well coordinate. Same as --U in files.

Returns
-------
(xs, ys, attr) : tuples with float and collections.OrderedDict
    Tuple of x's, y's and attributes.
leicaexperiment/experiment.py
def stitch_coordinates(self, well_row=0, well_column=0):
    """Get a list of stitch coordinates for the given well.

    Parameters
    ----------
    well_row : int
        Y well coordinate. Same as --V in files.
    well_column : int
        X well coordinate. Same as --U in files.

    Returns
    -------
    (xs, ys, attr) : tuples with float and collections.OrderedDict
        Tuple of x's, y's and attributes.
    """
    well = [w for w in self.wells
            if attribute(w, 'u') == well_column
            and attribute(w, 'v') == well_row]

    if len(well) == 1:
        well = well[0]
        tile = os.path.join(well, 'TileConfiguration.registered.txt')

        with open(tile) as f:
            data = [x.strip()
                    for l in f.readlines()
                    if l[0:7] == 'image--'
                    for x in l.split(';')]  # flat list

        coordinates = (ast.literal_eval(x) for x in data[2::3])
        # flatten
        coordinates = sum(coordinates, ())

        attr = tuple(attributes(x) for x in data[0::3])

        return coordinates[0::2], coordinates[1::2], attr
    else:
        print('leicaexperiment stitch_coordinates'
              '({}, {}) Well not found'.format(well_row, well_column))
[ "Get", "a", "list", "of", "stitch", "coordinates", "for", "the", "given", "well", "." ]
arve0/leicaexperiment
python
https://github.com/arve0/leicaexperiment/blob/c0393c4d51984a506f813319efb66e54c4f2a426/leicaexperiment/experiment.py#L333-L369
[ "def", "stitch_coordinates", "(", "self", ",", "well_row", "=", "0", ",", "well_column", "=", "0", ")", ":", "well", "=", "[", "w", "for", "w", "in", "self", ".", "wells", "if", "attribute", "(", "w", ",", "'u'", ")", "==", "well_column", "and", "attribute", "(", "w", ",", "'v'", ")", "==", "well_row", "]", "if", "len", "(", "well", ")", "==", "1", ":", "well", "=", "well", "[", "0", "]", "tile", "=", "os", ".", "path", ".", "join", "(", "well", ",", "'TileConfiguration.registered.txt'", ")", "with", "open", "(", "tile", ")", "as", "f", ":", "data", "=", "[", "x", ".", "strip", "(", ")", "for", "l", "in", "f", ".", "readlines", "(", ")", "if", "l", "[", "0", ":", "7", "]", "==", "'image--'", "for", "x", "in", "l", ".", "split", "(", "';'", ")", "]", "# flat list", "coordinates", "=", "(", "ast", ".", "literal_eval", "(", "x", ")", "for", "x", "in", "data", "[", "2", ":", ":", "3", "]", ")", "# flatten", "coordinates", "=", "sum", "(", "coordinates", ",", "(", ")", ")", "attr", "=", "tuple", "(", "attributes", "(", "x", ")", "for", "x", "in", "data", "[", "0", ":", ":", "3", "]", ")", "return", "coordinates", "[", "0", ":", ":", "2", "]", ",", "coordinates", "[", "1", ":", ":", "2", "]", ",", "attr", "else", ":", "print", "(", "'leicaexperiment stitch_coordinates'", "'({}, {}) Well not found'", ".", "format", "(", "well_row", ",", "well_column", ")", ")" ]
c0393c4d51984a506f813319efb66e54c4f2a426
valid
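`stitch_coordinates` parses Fiji's `TileConfiguration.registered.txt`: it keeps only lines starting with `image--`, splits each on `;`, takes every third token as a coordinate tuple, and evaluates it with `ast.literal_eval`. A sketch of that parsing on inlined sample lines; the exact line format here is an assumption inferred from the code, not copied from a real file:

```python
import ast

# Hypothetical TileConfiguration.registered.txt contents.
lines = [
    '# Define the number of dimensions',
    'image--U00--X00--Y00.ome.tif; ; (0.0, 0.0)',
    'image--U00--X01--Y00.ome.tif; ; (512.3, 0.7)',
]

# Flat list of ';'-separated tokens from the 'image--' lines:
# name, '', '(x, y)', name, '', '(x, y)', ...
data = [x.strip() for l in lines if l[0:7] == 'image--'
        for x in l.split(';')]

# Every third token is a coordinate tuple; flatten the tuples together.
coords = sum((ast.literal_eval(x) for x in data[2::3]), ())
xs, ys = coords[0::2], coords[1::2]
print(xs, ys)  # (0.0, 512.3) (0.0, 0.7)
```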
Droplets.create
Create a new droplet

Parameters
----------
name: str
    Name of new droplet
region: str
    slug for region (e.g., sfo1, nyc1)
size: str
    slug for droplet size (e.g., 512mb, 1024mb)
image: int or str
    image id (e.g., 12352) or slug (e.g., 'ubuntu-14-04-x64')
ssh_keys: list, optional
    default SSH keys to be added on creation
    this is highly recommended for ssh access
backups: bool, optional
    whether automated backups should be enabled for the Droplet.
    Automated backups can only be enabled when the Droplet is created.
ipv6: bool, optional
    whether IPv6 is enabled on the Droplet
private_networking: bool, optional
    whether private networking is enabled for the Droplet. Private
    networking is currently only available in certain regions
wait: bool, default True
    if True then block until creation is complete
poseidon/droplet.py
def create(self, name, region, size, image, ssh_keys=None,
           backups=None, ipv6=None, private_networking=None, wait=True):
    """
    Create a new droplet

    Parameters
    ----------
    name: str
        Name of new droplet
    region: str
        slug for region (e.g., sfo1, nyc1)
    size: str
        slug for droplet size (e.g., 512mb, 1024mb)
    image: int or str
        image id (e.g., 12352) or slug (e.g., 'ubuntu-14-04-x64')
    ssh_keys: list, optional
        default SSH keys to be added on creation
        this is highly recommended for ssh access
    backups: bool, optional
        whether automated backups should be enabled for the Droplet.
        Automated backups can only be enabled when the Droplet is created.
    ipv6: bool, optional
        whether IPv6 is enabled on the Droplet
    private_networking: bool, optional
        whether private networking is enabled for the Droplet. Private
        networking is currently only available in certain regions
    wait: bool, default True
        if True then block until creation is complete
    """
    if ssh_keys and not isinstance(ssh_keys, (list, tuple)):
        raise TypeError("ssh_keys must be a list")
    resp = self.post(name=name, region=region, size=size, image=image,
                     ssh_keys=ssh_keys,
                     private_networking=private_networking,
                     backups=backups, ipv6=ipv6)
    droplet = self.get(resp[self.singular]['id'])
    if wait:
        droplet.wait()
    # HACK sometimes the IP address doesn't return correctly
    droplet = self.get(resp[self.singular]['id'])
    return droplet
[ "Create", "a", "new", "droplet" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L80-L120
[ "def", "create", "(", "self", ",", "name", ",", "region", ",", "size", ",", "image", ",", "ssh_keys", "=", "None", ",", "backups", "=", "None", ",", "ipv6", "=", "None", ",", "private_networking", "=", "None", ",", "wait", "=", "True", ")", ":", "if", "ssh_keys", "and", "not", "isinstance", "(", "ssh_keys", ",", "(", "list", ",", "tuple", ")", ")", ":", "raise", "TypeError", "(", "\"ssh_keys must be a list\"", ")", "resp", "=", "self", ".", "post", "(", "name", "=", "name", ",", "region", "=", "region", ",", "size", "=", "size", ",", "image", "=", "image", ",", "ssh_keys", "=", "ssh_keys", ",", "private_networking", "=", "private_networking", ",", "backups", "=", "backups", ",", "ipv6", "=", "ipv6", ")", "droplet", "=", "self", ".", "get", "(", "resp", "[", "self", ".", "singular", "]", "[", "'id'", "]", ")", "if", "wait", ":", "droplet", ".", "wait", "(", ")", "# HACK sometimes the IP address doesn't return correctly", "droplet", "=", "self", ".", "get", "(", "resp", "[", "self", ".", "singular", "]", "[", "'id'", "]", ")", "return", "droplet" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
Droplets.get
Retrieve a droplet by id

Parameters
----------
id: int
    droplet id

Returns
-------
droplet: DropletActions
poseidon/droplet.py
def get(self, id):
    """
    Retrieve a droplet by id

    Parameters
    ----------
    id: int
        droplet id

    Returns
    -------
    droplet: DropletActions
    """
    info = self._get_droplet_info(id)
    return DropletActions(self.api, self, **info)
[ "Retrieve", "a", "droplet", "by", "id" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L122-L136
[ "def", "get", "(", "self", ",", "id", ")", ":", "info", "=", "self", ".", "_get_droplet_info", "(", "id", ")", "return", "DropletActions", "(", "self", ".", "api", ",", "self", ",", "*", "*", "info", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
Droplets.by_name
Retrieve a droplet by name (return first if duplicated)

Parameters
----------
name: str
    droplet name

Returns
-------
droplet: DropletActions
poseidon/droplet.py
def by_name(self, name):
    """
    Retrieve a droplet by name (return first if duplicated)

    Parameters
    ----------
    name: str
        droplet name

    Returns
    -------
    droplet: DropletActions
    """
    for d in self.list():
        if d['name'] == name:
            return self.get(d['id'])
    raise KeyError("Could not find droplet with name %s" % name)
[ "Retrieve", "a", "droplet", "by", "name", "(", "return", "first", "if", "duplicated", ")" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L141-L157
[ "def", "by_name", "(", "self", ",", "name", ")", ":", "for", "d", "in", "self", ".", "list", "(", ")", ":", "if", "d", "[", "'name'", "]", "==", "name", ":", "return", "self", ".", "get", "(", "d", "[", "'id'", "]", ")", "raise", "KeyError", "(", "\"Could not find droplet with name %s\"", "%", "name", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
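`by_name` is a linear scan that returns the first match and raises `KeyError` otherwise. The same lookup can be written with `next()` and a sentinel; a standalone sketch over plain dicts standing in for droplet records:

```python
# Hypothetical droplet records, for illustration only.
records = [
    {'id': 1, 'name': 'web-1'},
    {'id': 2, 'name': 'db-1'},
    {'id': 3, 'name': 'web-1'},  # duplicate name: first match wins, as in by_name
]

def by_name(name):
    # next() with a None sentinel avoids an explicit loop-and-break.
    match = next((d for d in records if d['name'] == name), None)
    if match is None:
        raise KeyError("Could not find droplet with name %s" % name)
    return match['id']

print(by_name('web-1'))  # 1, the first of the two 'web-1' records
```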
DropletActions.resize
Change the size of this droplet (must be powered off)

Parameters
----------
size: str
    size slug, e.g., 512mb
wait: bool, default True
    Whether to block until the pending action is completed
poseidon/droplet.py
def resize(self, size, wait=True):
    """
    Change the size of this droplet (must be powered off)

    Parameters
    ----------
    size: str
        size slug, e.g., 512mb
    wait: bool, default True
        Whether to block until the pending action is completed
    """
    return self._action('resize', size=size, wait=wait)
[ "Change", "the", "size", "of", "this", "droplet", "(", "must", "be", "powered", "off", ")" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L326-L337
[ "def", "resize", "(", "self", ",", "size", ",", "wait", "=", "True", ")", ":", "return", "self", ".", "_action", "(", "'resize'", ",", "size", "=", "size", ",", "wait", "=", "wait", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.restore
Restore this droplet with given image id

A Droplet restoration will rebuild an image using a backup image. The
image ID that is passed in must be a backup of the current Droplet
instance. The operation will leave any embedded SSH keys intact.

Parameters
----------
image: int or str
    int for image id and str for image slug
wait: bool, default True
    Whether to block until the pending action is completed
poseidon/droplet.py
def restore(self, image, wait=True):
    """
    Restore this droplet with given image id

    A Droplet restoration will rebuild an image using a backup image. The
    image ID that is passed in must be a backup of the current Droplet
    instance. The operation will leave any embedded SSH keys intact.

    Parameters
    ----------
    image: int or str
        int for image id and str for image slug
    wait: bool, default True
        Whether to block until the pending action is completed
    """
    return self._action('restore', image=image, wait=wait)
[ "Restore", "this", "droplet", "with", "given", "image", "id" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L339-L354
[ "def", "restore", "(", "self", ",", "image", ",", "wait", "=", "True", ")", ":", "return", "self", ".", "_action", "(", "'restore'", ",", "image", "=", "image", ",", "wait", "=", "wait", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.rebuild
Rebuild this droplet with given image id

Parameters
----------
image: int or str
    int for image id and str for image slug
wait: bool, default True
    Whether to block until the pending action is completed
poseidon/droplet.py
def rebuild(self, image, wait=True):
    """
    Rebuild this droplet with given image id

    Parameters
    ----------
    image: int or str
        int for image id and str for image slug
    wait: bool, default True
        Whether to block until the pending action is completed
    """
    return self._action('rebuild', image=image, wait=wait)
[ "Rebuild", "this", "droplet", "with", "given", "image", "id" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L356-L367
[ "def", "rebuild", "(", "self", ",", "image", ",", "wait", "=", "True", ")", ":", "return", "self", ".", "_action", "(", "'rebuild'", ",", "image", "=", "image", ",", "wait", "=", "wait", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.rename
Change the name of this droplet

Parameters
----------
name: str
    New name for the droplet
wait: bool, default True
    Whether to block until the pending action is completed

Raises
------
APIError if region does not support private networking
poseidon/droplet.py
def rename(self, name, wait=True):
    """
    Change the name of this droplet

    Parameters
    ----------
    name: str
        New name for the droplet
    wait: bool, default True
        Whether to block until the pending action is completed

    Raises
    ------
    APIError if region does not support private networking
    """
    return self._action('rename', name=name, wait=wait)
[ "Change", "the", "name", "of", "this", "droplet" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L369-L384
[ "def", "rename", "(", "self", ",", "name", ",", "wait", "=", "True", ")", ":", "return", "self", ".", "_action", "(", "'rename'", ",", "name", "=", "name", ",", "wait", "=", "wait", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.change_kernel
Change the kernel of this droplet

Parameters
----------
kernel_id: int
    Can be retrieved from output of self.kernels()
wait: bool, default True
    Whether to block until the pending action is completed

Raises
------
APIError if region does not support private networking
poseidon/droplet.py
def change_kernel(self, kernel_id, wait=True):
    """
    Change the kernel of this droplet

    Parameters
    ----------
    kernel_id: int
        Can be retrieved from output of self.kernels()
    wait: bool, default True
        Whether to block until the pending action is completed

    Raises
    ------
    APIError if region does not support private networking
    """
    return self._action('change_kernel', kernel=kernel_id, wait=wait)
[ "Change", "the", "kernel", "of", "this", "droplet" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L386-L401
[ "def", "change_kernel", "(", "self", ",", "kernel_id", ",", "wait", "=", "True", ")", ":", "return", "self", ".", "_action", "(", "'change_kernel'", ",", "kernel", "=", "kernel_id", ",", "wait", "=", "wait", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.take_snapshot
Take a snapshot of this droplet (must be powered off)

Parameters
----------
name: str
    Name of the snapshot
wait: bool, default True
    Whether to block until the pending action is completed
poseidon/droplet.py
def take_snapshot(self, name, wait=True):
    """
    Take a snapshot of this droplet (must be powered off)

    Parameters
    ----------
    name: str
        Name of the snapshot
    wait: bool, default True
        Whether to block until the pending action is completed
    """
    return self._action('snapshot', name=name, wait=wait)
[ "Take", "a", "snapshot", "of", "this", "droplet", "(", "must", "be", "powered", "off", ")" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L403-L414
[ "def", "take_snapshot", "(", "self", ",", "name", ",", "wait", "=", "True", ")", ":", "return", "self", ".", "_action", "(", "'snapshot'", ",", "name", "=", "name", ",", "wait", "=", "wait", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.delete
Delete this droplet

Parameters
----------
wait: bool, default True
    Whether to block until the pending action is completed
poseidon/droplet.py
def delete(self, wait=True):
    """
    Delete this droplet

    Parameters
    ----------
    wait: bool, default True
        Whether to block until the pending action is completed
    """
    resp = self.parent.delete(self.id)
    if wait:
        self.wait()
    return resp
[ "Delete", "this", "droplet" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L453-L465
[ "def", "delete", "(", "self", ",", "wait", "=", "True", ")", ":", "resp", "=", "self", ".", "parent", ".", "delete", "(", "self", ".", "id", ")", "if", "wait", ":", "self", ".", "wait", "(", ")", "return", "resp" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.wait
wait for all actions to complete on a droplet
poseidon/droplet.py
def wait(self):
    """
    wait for all actions to complete on a droplet
    """
    interval_seconds = 5
    while True:
        actions = self.actions()
        slept = False
        for a in actions:
            if a['status'] == 'in-progress':
                # n.b. gevent will monkey patch
                time.sleep(interval_seconds)
                slept = True
                break
        if not slept:
            break
[ "wait", "for", "all", "actions", "to", "complete", "on", "a", "droplet" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L467-L482
[ "def", "wait", "(", "self", ")", ":", "interval_seconds", "=", "5", "while", "True", ":", "actions", "=", "self", ".", "actions", "(", ")", "slept", "=", "False", "for", "a", "in", "actions", ":", "if", "a", "[", "'status'", "]", "==", "'in-progress'", ":", "# n.b. gevent will monkey patch", "time", ".", "sleep", "(", "interval_seconds", ")", "slept", "=", "True", "break", "if", "not", "slept", ":", "break" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
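The polling loop in `wait` can be isolated from the API for illustration. A sketch with the fetch and sleep callables injected (the names `fetch_actions` and `wait_for_actions` are hypothetical), driven by a canned sequence of action states:

```python
def wait_for_actions(fetch_actions, sleep, interval_seconds=5):
    # Same polling shape as DropletActions.wait: keep fetching the
    # action list and sleeping while any action is 'in-progress'.
    while True:
        slept = False
        for a in fetch_actions():
            if a['status'] == 'in-progress':
                sleep(interval_seconds)
                slept = True
                break
        if not slept:
            break

# Fake action feed: in-progress twice, then completed.
states = [[{'status': 'in-progress'}],
          [{'status': 'in-progress'}],
          [{'status': 'completed'}]]
sleeps = []
wait_for_actions(lambda: states.pop(0), sleeps.append, interval_seconds=1)
```

Injecting the sleep function also makes the loop testable without real delays, which is the same property the gevent monkey-patch note in the original relies on.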
valid
DropletActions.ip_address
Public ip_address
poseidon/droplet.py
def ip_address(self):
    """
    Public ip_address
    """
    ip = None
    for eth in self.networks['v4']:
        if eth['type'] == 'public':
            ip = eth['ip_address']
            break
    if ip is None:
        raise ValueError("No public IP found")
    return ip
[ "Public", "ip_address" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L485-L496
[ "def", "ip_address", "(", "self", ")", ":", "ip", "=", "None", "for", "eth", "in", "self", ".", "networks", "[", "'v4'", "]", ":", "if", "eth", "[", "'type'", "]", "==", "'public'", ":", "ip", "=", "eth", "[", "'ip_address'", "]", "break", "if", "ip", "is", "None", ":", "raise", "ValueError", "(", "\"No public IP found\"", ")", "return", "ip" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
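The lookup in `ip_address` is a simple filter over the droplet's `networks['v4']` list. A standalone sketch of the same logic against sample data (the addresses are documentation-range placeholders):

```python
def public_ip(networks):
    # Same filtering logic as DropletActions.ip_address: scan the v4
    # interfaces and return the first one typed 'public'.
    for eth in networks['v4']:
        if eth['type'] == 'public':
            return eth['ip_address']
    raise ValueError("No public IP found")

sample = {'v4': [
    {'type': 'private', 'ip_address': '10.0.0.2'},
    {'type': 'public', 'ip_address': '203.0.113.10'},
]}
ip = public_ip(sample)
```

Swapping `'public'` for `'private'` gives the `private_ip` variant from the next record.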
valid
DropletActions.private_ip
Private ip_address
poseidon/droplet.py
def private_ip(self):
    """
    Private ip_address
    """
    ip = None
    for eth in self.networks['v4']:
        if eth['type'] == 'private':
            ip = eth['ip_address']
            break
    if ip is None:
        raise ValueError("No private IP found")
    return ip
[ "Private", "ip_address" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L499-L510
[ "def", "private_ip", "(", "self", ")", ":", "ip", "=", "None", "for", "eth", "in", "self", ".", "networks", "[", "'v4'", "]", ":", "if", "eth", "[", "'type'", "]", "==", "'private'", ":", "ip", "=", "eth", "[", "'ip_address'", "]", "break", "if", "ip", "is", "None", ":", "raise", "ValueError", "(", "\"No private IP found\"", ")", "return", "ip" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
DropletActions.connect
Open SSH connection to droplet

Parameters
----------
interactive: bool, default False
    If True then SSH client will prompt for password when necessary
    and also print output to console
poseidon/droplet.py
def connect(self, interactive=False):
    """
    Open SSH connection to droplet

    Parameters
    ----------
    interactive: bool, default False
        If True then SSH client will prompt for password when necessary
        and also print output to console
    """
    from poseidon.ssh import SSHClient
    rs = SSHClient(self.ip_address, interactive=interactive)
    return rs
[ "Open", "SSH", "connection", "to", "droplet" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/droplet.py#L512-L524
[ "def", "connect", "(", "self", ",", "interactive", "=", "False", ")", ":", "from", "poseidon", ".", "ssh", "import", "SSHClient", "rs", "=", "SSHClient", "(", "self", ".", "ip_address", ",", "interactive", "=", "interactive", ")", "return", "rs" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
RestAPI.send_request
Send a request to the REST API

Parameters
----------
kind: str, {get, delete, put, post, head}
resource: str
url_components: list or tuple
    to be appended to the request URL

Notes
-----
kwargs contain request parameters to be sent as request data
poseidon/api.py
def send_request(self, kind, resource, url_components, **kwargs):
    """
    Send a request to the REST API

    Parameters
    ----------
    kind: str, {get, delete, put, post, head}
    resource: str
    url_components: list or tuple
        to be appended to the request URL

    Notes
    -----
    kwargs contain request parameters to be sent as request data
    """
    url = self.format_request_url(resource, *url_components)
    meth = getattr(requests, kind)
    headers = self.get_request_headers()
    req_data = self.format_parameters(**kwargs)
    response = meth(url, headers=headers, data=req_data)
    data = self.get_response(response)
    if response.status_code >= 300:
        msg = data.pop('message', 'API request returned error')
        raise APIError(msg, response.status_code, **data)
    return data
[ "Send", "a", "request", "to", "the", "REST", "API" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L47-L70
[ "def", "send_request", "(", "self", ",", "kind", ",", "resource", ",", "url_components", ",", "*", "*", "kwargs", ")", ":", "url", "=", "self", ".", "format_request_url", "(", "resource", ",", "*", "url_components", ")", "meth", "=", "getattr", "(", "requests", ",", "kind", ")", "headers", "=", "self", ".", "get_request_headers", "(", ")", "req_data", "=", "self", ".", "format_parameters", "(", "*", "*", "kwargs", ")", "response", "=", "meth", "(", "url", ",", "headers", "=", "headers", ",", "data", "=", "req_data", ")", "data", "=", "self", ".", "get_response", "(", "response", ")", "if", "response", ".", "status_code", ">=", "300", ":", "msg", "=", "data", ".", "pop", "(", "'message'", ",", "'API request returned error'", ")", "raise", "APIError", "(", "msg", ",", "response", ".", "status_code", ",", "*", "*", "data", ")", "return", "data" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
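The error branch in `send_request` turns any response with status code >= 300 into an `APIError` carrying the API's own message. A self-contained sketch of just that branch (the `APIError` shape here is an assumption for illustration, not the library's exact class):

```python
class APIError(Exception):
    # Hypothetical reconstruction: keeps the message, status code,
    # and any remaining response fields.
    def __init__(self, msg, status_code, **data):
        super().__init__(msg)
        self.status_code = status_code
        self.data = data

def check_response(status_code, data):
    # Mirrors the error branch in RestAPI.send_request: any status
    # >= 300 raises APIError with the API's message, if present.
    if status_code >= 300:
        msg = data.pop('message', 'API request returned error')
        raise APIError(msg, status_code, **data)
    return data

try:
    check_response(404, {'message': 'not found', 'id': 'not_found'})
except APIError as e:
    caught = (str(e), e.status_code)
```

Popping `'message'` before re-spreading the dict avoids passing the same value twice into the exception.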
valid
RestAPI.format_parameters
Properly formats array types
poseidon/api.py
def format_parameters(self, **kwargs):
    """
    Properly formats array types
    """
    req_data = {}
    for k, v in kwargs.items():
        if isinstance(v, (list, tuple)):
            k = k + '[]'
        req_data[k] = v
    return req_data
[ "Properly", "formats", "array", "types" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L91-L100
[ "def", "format_parameters", "(", "self", ",", "*", "*", "kwargs", ")", ":", "req_data", "=", "{", "}", "for", "k", ",", "v", "in", "kwargs", ".", "items", "(", ")", ":", "if", "isinstance", "(", "v", ",", "(", "list", ",", "tuple", ")", ")", ":", "k", "=", "k", "+", "'[]'", "req_data", "[", "k", "]", "=", "v", "return", "req_data" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
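`format_parameters` exists because the API expects array parameters as repeated `key[]` form fields, which is how `requests` encodes a list value under a `[]`-suffixed key. A standalone sketch of the same transformation:

```python
def format_parameters(**kwargs):
    # Mirror of RestAPI.format_parameters: list/tuple values get a
    # '[]' suffix so the form-encoded request repeats the field.
    req_data = {}
    for k, v in kwargs.items():
        if isinstance(v, (list, tuple)):
            k = k + '[]'
        req_data[k] = v
    return req_data

params = format_parameters(name='example', ssh_keys=[123, 456])
```

Scalar values pass through untouched; only sequences are renamed.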
valid
DigitalOceanAPI.format_request_url
create request url for resource
poseidon/api.py
def format_request_url(self, resource, *args):
    """create request url for resource"""
    return '/'.join((self.api_url, self.api_version, resource) +
                    tuple(str(x) for x in args))
[ "create", "request", "url", "for", "resource" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L134-L137
[ "def", "format_request_url", "(", "self", ",", "resource", ",", "*", "args", ")", ":", "return", "'/'", ".", "join", "(", "(", "self", ".", "api_url", ",", "self", ".", "api_version", ",", "resource", ")", "+", "tuple", "(", "str", "(", "x", ")", "for", "x", "in", "args", ")", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
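`format_request_url` is plain path joining over the base URL, API version, resource name, and any trailing path components. A standalone sketch (the base URL and version strings are illustrative):

```python
def format_request_url(api_url, api_version, resource, *args):
    # Mirrors DigitalOceanAPI.format_request_url: join base URL,
    # version, resource, and any path components with '/'.
    return '/'.join((api_url, api_version, resource) +
                    tuple(str(x) for x in args))

url = format_request_url('https://api.digitalocean.com', 'v2',
                         'droplets', 12345, 'actions')
```

Coercing each component through `str` is what lets callers pass integer droplet ids directly.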
valid
Resource.send_request
Send a request for this resource to the API Parameters ---------- kind: str, {'get', 'delete', 'put', 'post', 'head'}
poseidon/api.py
def send_request(self, kind, url_components, **kwargs):
    """
    Send a request for this resource to the API

    Parameters
    ----------
    kind: str, {'get', 'delete', 'put', 'post', 'head'}
    """
    return self.api.send_request(kind, self.resource_path,
                                 url_components, **kwargs)
[ "Send", "a", "request", "for", "this", "resource", "to", "the", "API" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L156-L165
[ "def", "send_request", "(", "self", ",", "kind", ",", "url_components", ",", "*", "*", "kwargs", ")", ":", "return", "self", ".", "api", ".", "send_request", "(", "kind", ",", "self", ".", "resource_path", ",", "url_components", ",", "*", "*", "kwargs", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
ResourceCollection.list
Send list request for all members of a collection
poseidon/api.py
def list(self, url_components=()):
    """
    Send list request for all members of a collection
    """
    resp = self.get(url_components)
    return resp.get(self.result_key, [])
[ "Send", "list", "request", "for", "all", "members", "of", "a", "collection" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L205-L210
[ "def", "list", "(", "self", ",", "url_components", "=", "(", ")", ")", ":", "resp", "=", "self", ".", "get", "(", "url_components", ")", "return", "resp", ".", "get", "(", "self", ".", "result_key", ",", "[", "]", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
MutableCollection.get
Get single unit of collection
poseidon/api.py
def get(self, id, **kwargs):
    """
    Get single unit of collection
    """
    return (super(MutableCollection, self).get((id,), **kwargs)
            .get(self.singular, None))
[ "Get", "single", "unit", "of", "collection" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L245-L250
[ "def", "get", "(", "self", ",", "id", ",", "*", "*", "kwargs", ")", ":", "return", "(", "super", "(", "MutableCollection", ",", "self", ")", ".", "get", "(", "(", "id", ",", ")", ",", "*", "*", "kwargs", ")", ".", "get", "(", "self", ".", "singular", ",", "None", ")", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
ImageActions.transfer
Transfer this image to given region

Parameters
----------
region: str
    region slug to transfer to (e.g., sfo1, nyc1)
poseidon/api.py
def transfer(self, region):
    """
    Transfer this image to given region

    Parameters
    ----------
    region: str
        region slug to transfer to (e.g., sfo1, nyc1)
    """
    action = self.post(type='transfer', region=region)['action']
    return self.parent.get(action['resource_id'])
[ "Transfer", "this", "image", "to", "given", "region" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L270-L280
[ "def", "transfer", "(", "self", ",", "region", ")", ":", "action", "=", "self", ".", "post", "(", "type", "=", "'transfer'", ",", "region", "=", "region", ")", "[", "'action'", "]", "return", "self", ".", "parent", ".", "get", "(", "action", "[", "'resource_id'", "]", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
Images.get
id or slug
poseidon/api.py
def get(self, id):
    """id or slug"""
    info = super(Images, self).get(id)
    return ImageActions(self.api, parent=self, **info)
[ "id", "or", "slug" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L306-L309
[ "def", "get", "(", "self", ",", "id", ")", ":", "info", "=", "super", "(", "Images", ",", "self", ")", ".", "get", "(", "id", ")", "return", "ImageActions", "(", "self", ".", "api", ",", "parent", "=", "self", ",", "*", "*", "info", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
Keys.update
id or fingerprint
poseidon/api.py
def update(self, id, name):
    """id or fingerprint"""
    return super(Keys, self).update(id, name=name)
[ "id", "or", "fingerprint" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L326-L328
[ "def", "update", "(", "self", ",", "id", ",", "name", ")", ":", "return", "super", "(", "Keys", ",", "self", ")", ".", "update", "(", "id", ",", "name", "=", "name", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
Domains.create
Creates a new domain

Parameters
----------
name: str
    new domain name
ip_address: str
    IP address for the new domain
poseidon/api.py
def create(self, name, ip_address):
    """
    Creates a new domain

    Parameters
    ----------
    name: str
        new domain name
    ip_address: str
        IP address for the new domain
    """
    return (self.post(name=name, ip_address=ip_address)
            .get(self.singular, None))
[ "Creates", "a", "new", "domain" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L348-L360
[ "def", "create", "(", "self", ",", "name", ",", "ip_address", ")", ":", "return", "(", "self", ".", "post", "(", "name", "=", "name", ",", "ip_address", "=", "ip_address", ")", ".", "get", "(", "self", ".", "singular", ",", "None", ")", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f
valid
Domains.records
Get a list of all domain records for the given domain name

Parameters
----------
name: str
    domain name
poseidon/api.py
def records(self, name):
    """
    Get a list of all domain records for the given domain name

    Parameters
    ----------
    name: str
        domain name
    """
    if self.get(name):
        return DomainRecords(self.api, name)
[ "Get", "a", "list", "of", "all", "domain", "records", "for", "the", "given", "domain", "name" ]
changhiskhan/poseidon
python
https://github.com/changhiskhan/poseidon/blob/6d1cecbe02f1e510dd185fe23f88f7af35eb737f/poseidon/api.py#L362-L372
[ "def", "records", "(", "self", ",", "name", ")", ":", "if", "self", ".", "get", "(", "name", ")", ":", "return", "DomainRecords", "(", "self", ".", "api", ",", "name", ")" ]
6d1cecbe02f1e510dd185fe23f88f7af35eb737f