FAQ: Application Enablement Services


FAQ: DMCC

Other

Why does MonitoringServices.addCallControlListener() fail with DMCC Java SDK 8.1.3?

A defect in the DMCC Java SDK version 8.1.3 means it is not possible to start a Call Control monitor when using a DMCC protocol version older than 8.1.3 (http://www.ecma-international.org/standards/ecma-323/csta/ed3/privE). This means that any application built with the 8.1.3 SDK cannot be used with an older version of AE Services.

This problem will be fixed in version 8.1.3.2 of the DMCC Java SDK, due to be released in the second half of 2021.

In the meantime, a hotfix is available on the DevConnect website. On the hotfix page you will find the following files:

  • DMCC Java SDK version 8.1.3 (both Windows and Linux)
  • A link to the PSN document which describes how to install the hotfix on top of the SDK
  • The hotfix files as a zip archive.

The hotfix comprises two files (one jar and one xml). Developers who need fixes or features of the DMCC Java SDK version 8.1.3 should use these files, instead of the equivalent files in the SDK, when building, testing and deploying an application that uses the 8.1.3 version of the DMCC Java SDK. Applying the hotfix is recommended in all situations.

Can you describe the MedPro and Timeslot resource allocation strategy for conference scenarios where multiple IP-network regions and different endpoint types are involved?

The purpose of the following scenarios and questions is to understand the mechanism that determines on which gateway (or gateways) a conference call will be held, including the MedPro/VoIP and timeslot resources used, in a system that includes several network regions and several gateways (or port networks).

Please elaborate on the exact way each scenario in the table below will set up the conference between the 3 participants in each scenario (the participants belong to different regions according to the different columns).

Assume the IP phones are configured for direct IP-IP audio in both inter-region and intra-region communications in an IP-Connect configuration.

Can Communication Manager provide tones into a call while it is being recorded?

Many countries have regulations which require that a tone be inserted into a call while it is recorded.  Communication Manager can be configured to generate such a tone. Additionally, the application can configure a device to insert a tone when that device is involved in a call in some recording configurations.

Service Observing Recorders
If the Service Observing (SO) methodology of call recording is being used, then Communication Manager can be configured to insert a recurring tone into the call through the Service Observing: Warning Tone? parameter on the 'change system-parameters features' SAT form in the Call Center section (page 11/19 in release 8.1).
When this parameter is enabled (it is enabled by default), a warning tone is played into a call at regular intervals while a service observer is present in the call. It is not possible to alter this tone.

Single-Step Conference (SSC) and Multiple Registration (MR) Recorders
As of AE Services 6.3 and CM 6.3, a DMCC device can be pre-configured to cause the Service Observer Warning Tone to be inserted into a call when the device is added into the call.

The application creates the device (the recorder) and then configures it to add recording tone using the Generate Telephony Tones feature.  After that, any time the device is added into a call (via SSC or Multiple Registrations), the tone is provided to the call.   Currently (release 8.1) it is not possible to alter this tone. The tone is the same tone that is used with Service Observing. The tone cadence is:

  • 1400 Hz at -11 dB for 200 msec
  • Silence for 15 sec
  • Repeat [1400 Hz, silence] forever
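For developers who need to recognize or reproduce this cadence in testing, the timing can be sketched as plain PCM generation. The class and constants below are illustrative, not part of any Avaya SDK:

```java
// Sketch (not SDK code): generate one cycle of the Service Observing
// warning-tone cadence as 8 kHz 16-bit PCM samples.
public class SoWarningTone {
    static final int SAMPLE_RATE = 8000;   // typical telephony sample rate
    static final double FREQ_HZ = 1400.0;  // tone frequency from the cadence
    static final double LEVEL_DB = -11.0;  // tone level from the cadence
    static final int TONE_MS = 200;        // tone burst duration
    static final int SILENCE_MS = 15000;   // silence between bursts

    /** One full cadence cycle: 200 ms of 1400 Hz followed by 15 s of silence. */
    public static short[] oneCycle() {
        int toneSamples = SAMPLE_RATE * TONE_MS / 1000;
        int silenceSamples = SAMPLE_RATE * SILENCE_MS / 1000;
        short[] pcm = new short[toneSamples + silenceSamples];
        double amplitude = Math.pow(10.0, LEVEL_DB / 20.0) * Short.MAX_VALUE;
        for (int i = 0; i < toneSamples; i++) {
            pcm[i] = (short) (amplitude *
                    Math.sin(2.0 * Math.PI * FREQ_HZ * i / SAMPLE_RATE));
        }
        // remaining samples stay zero, i.e. silence
        return pcm;
    }

    public static void main(String[] args) {
        System.out.println("cycle length in samples: " + oneCycle().length);
    }
}
```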

For example, to enable the Generate Telephony Tones feature for a recorder, using .Net:

serviceProvider.getCallAssociated.GenerateTelephonyTones(
    recordingDevice.getDeviceIdAsString, null);

See the appropriate DMCC programmers guide for more information on Generate Telephony Tones.

Alternative Methods

Conference Tone

If the Single Step Conference (SSC) methodology of call recording is being used, then Communication Manager can be configured to provide a recurring tone to all conference calls. This is enabled through the Conference Tone flag of the 'change system-parameters features' SAT form (page 6/19 in release 8.1). The system must also have a conference tone configured which will generate a repeating tone.  For example, the following causes a short tone to be generated every 2 seconds.

change tone-generation                                          Page   2 of  21

TONE GENERATION CUSTOMIZED TONES

Tone Name     Cadence    Tone
              Step       (Frequency/Level)

conference    1:         330/-8.0         Duration(msec): 200
              2:         silence          Duration(msec): 2000
              3:         goto             Step: 1

Note: ALL conference calls will hear this tone, regardless of the call being recorded or not.

Application Injected Tone

When SSC or SO are in use for recording, the recording application has a media capable endpoint (the DMCC based recorder) connected into the call.  The application can use this device to send RTP representing a tone to Communication Manager that will be summed into the audio stream sent to all the call participants.

Normally, it is not required for the DMCC endpoint to inject RTP into a call so the application adds it in listen only (SO) or silent (SSC) mode.   In order to be able to inject RTP into a call, the recorder must be added in listen/talk (SO) or active (SSC) mode.

There are some drawbacks to using non-silent mode. Firstly, it consumes an extra talk timeslot for the recorder, which can reduce the maximum number of simultaneous recordings that can be performed on a G4x0 gateway compared to silent mode recording. Secondly, for SSC, the phone displays may show “Conference” instead of the number/name of the other party.

If MR is used for recording, the monitored phone and the DMCC device cannot ‘talk’ at the same time.  In order to inject RTP into the call, the application must use the share-talk feature to be able to send RTP instead of the monitored phone.  The sequence is as follows:

  1. Application presses share-talk button to take control of the voice path (potentially interrupting the speaker)
  2. Application sends tone as RTP to the media server or media gateway
  3. Application presses share-talk button to return control of the voice path to the phone

There is more information on the share-talk feature in the appropriate DMCC programmers guide.

Note:  Anything the agent says while the tone is being played (more specifically between when the share-talk button presses are processed by Communication Manager) will be lost.  This can have undesirable effects.
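The three-step share-talk sequence can be sketched as follows. The ShareTalkControl interface here is a hypothetical stand-in for the SDK's button-press and media services, not a real DMCC type; only the ordering of the steps is the point:

```java
// Illustrative only: ShareTalkControl is hypothetical, standing in for
// whatever button-press and RTP plumbing the application actually uses.
import java.util.ArrayList;
import java.util.List;

public class ShareTalkToneInjector {
    /** Hypothetical stand-in for the application's DMCC services. */
    public interface ShareTalkControl {
        void pressShareTalk();  // toggles control of the voice path
        void sendToneRtp();     // streams the tone as RTP
    }

    /** Runs the inject sequence: press, send tone, press again to release. */
    public static void injectTone(ShareTalkControl control) {
        control.pressShareTalk();  // step 1: take the voice path from the phone
        control.sendToneRtp();     // step 2: send the tone as RTP
        control.pressShareTalk();  // step 3: return the voice path to the phone
    }

    public static void main(String[] args) {
        List<String> log = new ArrayList<>();
        injectTone(new ShareTalkControl() {
            public void pressShareTalk() { log.add("press"); }
            public void sendToneRtp() { log.add("tone"); }
        });
        System.out.println(log);
    }
}
```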

Are there any guidelines when adding Listeners when Registering?

Be aware of the following things:

  • Add listeners prior to registering so that events are not missed after the registered event is received.
  • If the device is unregistered (either by application actions or by other factors such as loss of connectivity with CM), note that the listeners that have been added are still associated with the device. All that is needed is to re-register; adding the listeners again will cause problems. Alternatively, disconnect the listeners prior to re-registering.
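The second guideline can be illustrated with a toy model (no real DMCC types involved): listeners persist across unregistration, so only re-registration is needed, without adding the listeners a second time:

```java
// A toy model, not the DMCC API: a device whose listener list survives
// unregistration, matching the guideline above.
import java.util.ArrayList;
import java.util.List;

public class ListenerLifecycleDemo {
    public interface Listener { }

    /** Minimal device model: registration state plus a persistent listener list. */
    public static class Device {
        private final List<Listener> listeners = new ArrayList<>();
        private boolean registered;

        public void addListener(Listener l) { listeners.add(l); }
        public void register() { registered = true; }
        public void unregister() { registered = false; }  // listeners are kept
        public int listenerCount() { return listeners.size(); }
        public boolean isRegistered() { return registered; }
    }

    public static void main(String[] args) {
        Device d = new Device();
        d.addListener(new Listener() { });  // add BEFORE registering
        d.register();
        d.unregister();                     // e.g. lost connectivity with CM
        d.register();                       // re-register only; do NOT re-add
        System.out.println("listeners after re-register: " + d.listenerCount());
    }
}
```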

During the Session Inactive state what happens with CM events destined for the application?

If the AE Services Server stops receiving requests or keep-alive messages from the application, the session enters the inactive state. While the session is inactive, events received from CM for devices tracked by the session are not queued; they are discarded. If the session is re-established before it is terminated, the application must reacquire the state of the devices it was monitoring: issue getLampState, getHookswitch and getDisplay requests to refresh any state information the application was keeping. It is also good practice to issue a getButton request at regular intervals, including after the session has gone inactive, to verify that there have been no provisioning changes to the monitored device. Note that it is not necessary to reconnect listeners when the session is re-established.

I'm receiving a "java.net.NoRouteToHostException: No route to host" exception, how can I fix it?

java.net.NoRouteToHostException is described by the Java documentation as follows: "Signals that an error occurred while attempting to connect a socket to a remote address and port. Typically, the remote host cannot be reached because of an intervening firewall, or if an intermediate router is down."

To disable the firewall on the CMAPI connector server (2.1 and prior) run the following command:

/sbin/service iptables stop

  • Review your network for an unanticipated firewall or misconfigured router.
  • Make sure the destination IP address is correct using ping or a similar technique.
  • Review the /etc/hosts file for a misconfigured AE Services Server entry.
  • If using a DNS name, make sure that the domain name server is operational and accessible from the application server.
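In application code, the different failure causes in the checklist above can be distinguished by catching the specific exception types. A minimal sketch; the host and port values are placeholders:

```java
// Separates "name did not resolve" from "no route" from "host refused",
// mapping onto the troubleshooting checklist. Host/port are placeholders.
import java.io.IOException;
import java.net.ConnectException;
import java.net.InetSocketAddress;
import java.net.NoRouteToHostException;
import java.net.Socket;
import java.net.UnknownHostException;

public class ConnectDiagnostics {
    /** Attempts a TCP connect and returns a human-readable diagnosis. */
    public static String diagnose(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return "connected";
        } catch (UnknownHostException e) {
            return "DNS lookup failed - check the hostname and name server";
        } catch (NoRouteToHostException e) {
            return "no route to host - check firewalls and routers in the path";
        } catch (ConnectException e) {
            return "host reachable but refused the connection - check the port";
        } catch (IOException e) {
            return "other I/O failure: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // "aes.example.invalid" is a placeholder; .invalid never resolves.
        System.out.println(diagnose("aes.example.invalid", 4722, 2000));
    }
}
```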

How can I monitor the XML being sent and received by the AE Services Server (debug, log, trace)?

The following instructions are provided for developers working with "lab" or development machines. Turning "up" logging is not intended to be done on production machines due to potential service impacts associated with doing so. If logging is increased on a production machine, be sure to decrease it when you are finished.

These instructions apply to AE Services release 6.3 and later. For releases prior to this, please see the FAQ How can I monitor the XML being sent and received by the AE Services Server (debug, log, trace) – pre 6.3 release?

As of AE Services release 6.3, it is possible to enable and download DMCC traces using the AE Services OAM administration web page.

To change the trace level, login to the OAM Administration website and navigate to Status > Log Management. Next, change the value for “XML Logging” in the DMCC section to “Finest” and click Apply Changes.

Log Manager screen on Management Console

On the next screen, click Apply. Once enabled or disabled through the web interface, logging begins or ends immediately.

When you are done collecting logs at a higher log level, remember to return to this web page and reduce the logging filter to "FINE".

To retrieve DMCC traces from AE Services, log in to the OAM Administration website and navigate to Status > Logs > Error Logs. Select "dmcc-trace.log.0" and click Download to save the trace as a zip file to your computer.

Note that AE Services uses a rolling log file technique (.0 becomes .1, .1 becomes .2 and so on) to avoid creating overly large log files. Depending on the interval between the problem occurring and the log file being downloaded, the log file name may have changed. It is useful to keep the AE Services clock synchronized to the correct time of day and to use timestamps to locate the file that contains the information of interest.

How can I monitor the XML being sent and received by the AE Services Server (debug, log, trace) - pre-6.3 release?

The following instructions are provided for developers working with "lab" or development machines. Turning "up" logging is not intended to be done on production machines due to potential service impacts associated with doing so. If logging is increased on a production machine, be sure to decrease it when you are finished.

This FAQ is valid for all releases of AE Services. However, as of AE Services 6.3, there is a simpler alternative procedure available. This is described in the FAQ, How can I monitor the XML being sent and received by the AE Services Server (debug, log, trace)?

If you prefer not to work through the following instructions, you can simply use the example dmcc-logging.properties file at the end of this answer.

Changes to Logging in AE Services 5.2

Please note that this guide is based on AE Services Release 5.2 and later. For releases of AE Services before 5.2, the configuration file, /opt/mvap/conf/dmcc-logging.properties, is called /opt/mvap/conf/logging.properties and the log files, /opt/mvap/logs/dmcc-trace.log.*, are called /opt/mvap/logs/mvap-trace.log.*

Understanding how the logging levels / handlers work

The Java logger in the AE Services Server uses a concept of handlers: various handlers know how to process the log records emitted by the application. Each handler has an associated log level that specifies how it filters; it will not pass any log record that is less severe than the level defined for that handler. Handlers can be chained together, in which case each handler applies its filter in turn.

The first thing the logger does when applying filters is to check to see if the calling class has a log level specified for it at the bottom of the /opt/mvap/conf/dmcc-logging.properties file. This would typically be done to enable FINER or FINEST level logging for a particular piece of code. If a level is defined for this class, that level is applied as a filter. If not, the global .level setting is used from the top of the file.

Next, the logger will send the log to each of the handlers that has been entered on the "handlers" line. These handlers will each apply their own filters and rules, and do the appropriate thing with the logs.
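AE Services uses the standard java.util.logging framework, so the two-stage filtering described above can be demonstrated with it directly: both the logger's level and each handler's level must admit a record before the handler processes it.

```java
// Demonstrates java.util.logging's filtering chain: the logger filters
// first, then each handler applies its own level.
import java.util.concurrent.atomic.AtomicInteger;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class LevelChainDemo {
    /** A handler that counts the records that get past its level filter. */
    public static class CountingHandler extends Handler {
        public final AtomicInteger count = new AtomicInteger();
        @Override public void publish(LogRecord r) {
            if (isLoggable(r)) count.incrementAndGet();
        }
        @Override public void flush() { }
        @Override public void close() { }
    }

    public static int run() {
        Logger logger = Logger.getLogger("demo.levels");
        logger.setUseParentHandlers(false);
        logger.setLevel(Level.FINE);   // logger admits FINE and above
        CountingHandler h = new CountingHandler();
        h.setLevel(Level.INFO);        // handler admits only INFO and above
        logger.addHandler(h);

        logger.fine("passes the logger, filtered out by the handler");
        logger.info("passes both filters");
        logger.finest("filtered out by the logger itself");
        return h.count.get();          // only the INFO record survives
    }

    public static void main(String[] args) {
        System.out.println("records delivered: " + run());
    }
}
```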

DMCC handlers

DMCC has three handlers specified on its "handlers" line:

  • ThreadedHandler: This is the handler that ends up going to the dmcc-trace.log.* files. It ends up actually writing to the log file on a different thread so that it doesn't cause the main DMCC threads to back up because of heavy log traffic. It is chained with the FileHandler, which is the handler that actually writes to the file. If you want to change the log level for the dmcc-trace.log.* file, you have to change the level for both the ThreadedHandler and the FileHandler, since they each apply their filter. By default, FINE logs and above get logged here.
  • ErrorFileHandler: This is the handler that ends up going to the dmcc-error.log.* files. This uses the MemoryHandler behind the scenes. All logs down to the FINER level are held in memory and are not written to the file. If a WARNING level log is received, however, the last 500 entries at FINER level or above are pushed to the log file. This allows the individual who is debugging to see some context of what might have caused the problem.
  • ApiFileHandler: This handler logs all API calls to a file.

How to increase the level of logging

In general, if you want to increase the detail of DMCC logging, you will want to change what is logged to the dmcc-trace.log.* files. You will need to edit the /opt/mvap/conf/dmcc-logging.properties file on the AE Services Server and then restart the server. There are two ways you might want to turn up logging:

  • Turn up the log levels for all classes: If you want to increase the log level for all classes, you need to change the level for the ThreadedHandler and for the FileHandler. If you want to go all the way to FINEST, you'll also have to change the global level to FINEST.
  • Turn up the log level for individual classes: If you only want to increase the log level for some classes, add lines at the bottom of the file of the form <packagename.classname>.level = FINEST (e.g. com.avaya.mvcs.proxy.CstaMarshallerNode.level=FINEST). You will then have to change the levels for the ThreadedHandler and FileHandler so that they do not filter out these logs. Note that you may then have to change the global level to FINE if you do not want all the FINER logs from the other classes; this is necessary because you relaxed the restrictions on the ThreadedHandler and FileHandler.

To see all XML messages coming in and out of the AE Services server

Add the following to the bottom of the dmcc-logging.properties file in the AE Services server in the /opt/mvap/conf directory:

# #################################################
# Enable tracing of all XML messages into the dmcc-trace.log.* files
com.avaya.mvcs.proxy.CstaMarshallerNode.level=FINEST
com.avaya.mvcs.proxy.CstaUnmarshallerNode.level=FINEST
# #################################################

Before you will see the output, you need to change com.avaya.common.logger.ThreadedHandler.level and com.avaya.common.logger.FileHandler.level to FINEST:

com.avaya.common.logger.ThreadedHandler.level=FINEST
com.avaya.common.logger.FileHandler.level=FINEST

Then you probably want to reduce the global filter level to FINE, so that you do not also collect FINER-level events from all the other classes now that the handlers no longer filter them out:

.level=FINE

The log files are kept in /opt/mvap/logs on the AE Services server. A maximum of 20 log files are kept; the most recent is dmcc-trace.log.0, and the files roll over based on file size.
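This rolling behaviour is the standard java.util.logging FileHandler mechanism that the configuration file drives: "limit" caps each file's size and "count" caps how many generations are kept. A small demonstration (the file name below is illustrative):

```java
// Reproduces the dmcc-trace.log.0 .. .N rolling scheme with a plain
// java.util.logging FileHandler. The file name is illustrative.
import java.io.File;
import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class RollingLogDemo {
    public static File writeLogs() {
        try {
            File dir = new File(System.getProperty("java.io.tmpdir"));
            String pattern = new File(dir, "rolling-demo.log").getAbsolutePath();
            // 10 MB per file, 20 generations: the values the AE Services sample uses
            FileHandler handler = new FileHandler(pattern, 10485760, 20, false);
            handler.setLevel(Level.ALL);
            Logger logger = Logger.getLogger("demo.rolling");
            logger.setUseParentHandlers(false);
            logger.addHandler(handler);
            logger.info("sample record");
            handler.close();
            // with count > 1, a generation number is appended: .0 is most recent
            return new File(pattern + ".0");
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("log file: " + writeLogs());
    }
}
```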

Please remember to return logging to the default level when you are done.

Enabling the New Logging Settings

The new logging levels/information can be enabled by one of the following:

  • Restart the AE Services as the root user via the following command line interface: [root@youraes ~]# /sbin/service aesvcs restart
  • Starting with Build 37-1 of AE Services 3.1, it is possible to change the logging levels without a service disruption. To do this, restart the DmccMain JVM from the command line as follows as a root user: [root@youraes ~]# jps
    3250 Bootstrap
    3732 run.jar
    3707 WrapperSimpleApp
    5552 Jps
    4119 SnmpAgent
    3649 Main
    3466 LcmMain
    8035 DmccMain
    [root@youraes ~]# kill -12 8035

BEWARE: If you are tailing dmcc-trace.log.0, this file rolls over to a new file, so you will have to restart your "tailing".

The following is a sample dmcc-logging.properties file from a release 6.2 AE Services server with the above changes applied. Note that future releases of the AE Services server may have additional filters in the dmcc-logging.properties file, so be careful when applying this example to other releases.

Example dmcc-logging.properties file:

############################################################
# DMCC Server Logging Configuration File
############################################################

############################################################
# Global properties
# Default global logging level.
# This specifies which kinds of events are logged across
# all loggers. For any given facility this global level
# can be overridden by a facility specific level
# Note that the ConsoleHandler also has a separate level
# setting to limit messages printed to the console.

.level=FINE
# lower the .level setting from FINER to FINE to reduce the extraneous
# FINEST information from the logs and just get XML.

# handlers defines a whitespace separated list of class
# names for handler classes to load and register as handlers
# on the root Logger (the Logger named ""). Each class name
# must be for a Handler class which has a default
# constructor. Note that these Handlers may be created
# lazily, when they are first used.
handlers=com.avaya.common.logger.ThreadedHandler com.avaya.common.logger.ErrorFileHandler com.avaya.common.logger.ApiFileHandler

# config defines a whitespace separated list of class names.
# A new instance will be created for each named class. The
# default constructor of each class may execute arbitrary
# code to update the logging configuration, such as setting
# logger levels, adding handlers, adding filters, etc.
#config=
############################################################

############################################################
# configure com.avaya.common.logger.ThreadedHandler

# com.avaya.common.logger.ThreadedHandler logs to its target
# Handler asynchronously (on an independent thread),
# preventing server threads from blocking for disk I/O
com.avaya.common.logger.ThreadedHandler.target=java.util.logging.FileHandler

com.avaya.common.logger.ThreadedHandler.level=FINEST
############################################################

############################################################
# configure java.util.logging.FileHandler
# level specifies the default level for the Handler (defaults to Level.ALL).
# filter specifies the name of a Filter class to use (defaults to no Filter).
# formatter specifies the name of a Formatter class to use (defaults to java.util.logging.XMLFormatter)
# encoding the name of the character set encoding to use (defaults to the default platform encoding).
# limit specifies an approximate maximum amount to write (in bytes) to any one file. If this is zero, then there is no limit. (Defaults to no limit).
# count specifies how many output files to cycle through (defaults to 1).
# pattern specifies a pattern for generating the output file name. (Defaults to "%h/java%u.log").
# append specifies whether the FileHandler should append onto any existing files (defaults to false).

java.util.logging.FileHandler.level=FINEST
java.util.logging.FileHandler.pattern=../logs/dmcc-trace.log
java.util.logging.FileHandler.limit=10485760
java.util.logging.FileHandler.count=20
java.util.logging.FileHandler.formatter=com.avaya.common.logger.MillisecFormatter
############################################################

############################################################
# configure com.avaya.common.logger.ErrorFileHandler
# This handler contains code that uses a MemoryHandler that
# pushes to a ThreadedHandler whose target is a FileHandler
# with the pattern specified here. The level set here
# is propagated through the entire Handler chain.
# The result is a log containing detailed error pretext.
com.avaya.common.logger.ErrorFileHandler.level=FINER
com.avaya.common.logger.ErrorFileHandler.pattern=../logs/dmcc-trace.log
############################################################

############################################################
# configure java.util.logging.MemoryHandler
# filter specifies the name of a Filter class to use (defaults to no Filter).
# level specifies the level for the Handler (defaults to Level.ALL)
# size defines the buffer size (defaults to 1000).
# push defines the pushLevel (defaults to level.SEVERE).
# target specifies the name of the target Handler class. (no default).
java.util.logging.MemoryHandler.level=FINE
java.util.logging.MemoryHandler.size=1000
java.util.logging.MemoryHandler.push=WARNING
############################################################

############################################################
# configure com.avaya.common.logger.ApiFileHandler
# This handler is a ThreadedHandler whose target is a
# FileHandler with the pattern specified here. The level set
# here is propagated to the FileHandler. By default, this
# Handler is configured with a filter to log all API calls.
# filter specifies the name of a Filter class to use (defaults to no Filter).
com.avaya.common.logger.ApiFileHandler.level=FINE
com.avaya.common.logger.ApiFileHandler.pattern=../logs/dmcc-trace.log
com.avaya.common.logger.ApiFileHandler.filter=com.avaya.common.logger.RegExFilter
############################################################

############################################################
# configure com.avaya.common.logger.RegExFilter
# Filters LogRecords by matching their Logger name using the
# regular expression specified in the pattern property.
com.avaya.common.logger.RegExFilter.pattern=^com\.avaya\.api.*
############################################################

############################################################
# Facility specific properties (extra control per logger)
#com.xyz.foo.level = SEVERE
sun.rmi.level = WARNING
com.avaya.platform.jmx.Mx4jXSLTProcessor.level = WARNING
############################################################

############################################################
# Enable tracing of all XML messages into the dmcc-trace.log.* files
com.avaya.mvcs.proxy.CstaMarshallerNode.level=FINEST
com.avaya.mvcs.proxy.CstaUnmarshallerNode.level=FINEST
############################################################

How can the application tell if the de/activation of Service Observing (SO) was successful?

You can monitor for Physical Device Events, specifically Lamp Mode events, and look for the following:

  • If you are enabling Service Observation via a feature access code, then the active call appearance (where the feature access code has been "dialed") will change from steady green (Lamp Mode = steady, Lamp Color = Green) to dark (Lamp Mode = off).
  • If enabling via an SO button on the CMAPI phone, then the lamp (Lamp Mode) associated with the SO button flutters. If a feature access code is used and the station has a SO button provisioned, it will flutter as well.
  • If the SO enabling is unsuccessful, then the lamp will be steady green for the period of time when the switch is playing an intercept tone to the CMAPI phone.

Are there plans in the future to also supply a COM/C# or .NET based API, similar to the Java API?

The DMCC .NET API was released with version 4.1 of the AE Services software. It can be found on the Release 4.1 AE Services page in the Avaya Aura Application Enablement Services area of the DevConnect website.

Does the XML-Interface expose exactly the same functionality as the Java-API and .NET API?

Yes. The same capabilities are available in all three DMCC APIs. The Java and .NET APIs provide simplifications to some interfaces, such as monitoringServices and session handling, and they send and receive XML to and from AE Services. Programmers using any of the three APIs have access to the same capabilities.

Can I utilize SIP Trunks with a CMAPI Tone Detecting (IR) Application?

You can use a SIP trunk where DTMF over IP is set to in-band RTP (i.e. "rtp-payload" on the signaling group form); note that out-of-band DTMF is not supported on SIP trunks.

The CMAPI tone detection (ttd_mode) should be set to "out-of-band" for proper delivery of DTMF from TDM sources and IP trunk sources configured as "in-band." The "Intra-System IP DTMF Transmission Mode" setting in the system-parameters ip-options form must be set to "out-of-band" when ttd_mode is "out-of-band".

The "DTMF over IP" setting in the signaling group applies to the trunk that the signaling group is associated with. IP trunks connect two separate switches that are usually a considerable distance apart and so, a compressed codec like G.729 is usually used on the IP trunk. In-band DTMF detection in G.729 streams is not always reliable, and so most people use out-of-band DTMF on IP trunks.

SIP trunks can only support in-band RTP. This is different from in-band tone detection. So essentially there are three different modes in CM:

  1. In-band RTP (through RTP header)
  2. In-band (through payload itself)
  3. Out of band (through h.323 signaling)

CM will always signal tones out-of-band to H.323 IP phones; IP (H.323) phones do not perform in-band tone detection. Similarly, tones originating from an IP (H.323) phone are always out of band.

With respect to CMAPI endpoints, tones will always be passed out of band to CMAPI endpoints if the far end is an IP (H.323) endpoint or sip trunk.

In CM 2.1, the settings on the ip-options form apply to inter-gateway calls as well as calls between digital endpoints and CMAPI stations.

Also in CM 2.1, the settings on the signaling group form apply to IP (H.323) trunks only.

In CM 3.0, all tones are out of band for CMAPI endpoints. Note that IP phones always send tones out of band. The signaling group form and the ip-parameters form apply only to circuit-switched endpoints communicating with a CMAPI endpoint. With CM 3.0, load 333, this administration is disabled.

What invokeID do events have?

All events from the CMAPI/AE Services server have an invokeID of 9999.

Are responses to commands immediate and guaranteed?

Responses are not necessarily immediate and in the same order as the requests were made. The invokeID can be used to correlate responses to requests.

Call related events may be in different order from one call to the next (ringing, lamp and display updates are one example).

Applications will receive a response to a command/request in the form of a positive response or a negative response. Most negative responses come in the form of exceptions. There are a few cases of explicit negative responses, such as StartApplicationSessionNegResponse.

It is also possible for events related to application of the request to be received before the response for that request.

Please take a look at the XML Programmers' Guide, which discusses the request/response framework.
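A common client-side pattern for this request/response framework is to correlate responses to requests with a map keyed by invokeID, treating invokeID 9999 as an unsolicited event. The sketch below is illustrative, not actual DMCC client code; the transport layer is omitted:

```java
// Sketch of invokeID-based correlation: pending requests wait on futures,
// events (invokeID 9999) are never correlated. Not real DMCC client code.
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class InvokeIdCorrelator {
    public static final int EVENT_INVOKE_ID = 9999;  // all events use this ID

    private final Map<Integer, CompletableFuture<String>> pending =
            new ConcurrentHashMap<>();
    private final AtomicInteger nextId = new AtomicInteger(1);

    /** Allocates an invokeID and records a future for the eventual response. */
    public CompletableFuture<String> send(String request) {
        int id = nextId.getAndIncrement();
        CompletableFuture<String> response = new CompletableFuture<>();
        pending.put(id, response);
        // ... the request would be written to the socket here, tagged with id
        return response;
    }

    /** Called for every inbound message; returns true if it was a response. */
    public boolean onMessage(int invokeId, String body) {
        if (invokeId == EVENT_INVOKE_ID) return false;  // unsolicited event
        CompletableFuture<String> f = pending.remove(invokeId);
        if (f != null) f.complete(body);
        return f != null;
    }
}
```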

What services/modules do I need on the application server to converse as a phone for my soft phone application?

If you're interested in implementing your own CMAPI soft phone and using it to converse as a phone in order to access the media stream from Communication Manager, then you need to first register your CMAPI application in exclusive control mode of the device (extension), with client media mode. In this mode, your application will have exclusive control of the signaling interface for an extension (which first needs to be administered in the switch) and will send and receive the RTP stream to and from a RTP address that the application specifies when registering.

Are there additional services an application needs to invoke when using client media mode?

Because the client application is terminating the RTP stream, it will need an RTP stack. The developer can choose from the Avaya-provided Media Stack, a 3rd party stack, or the developer can implement their own. The application should also monitor for Media Control Events (MediaStart and MediaStop) - these will provide the application with the far end RTP information. The "simple IVR" Java sample code included with the SDK (or the RPTC .NET Sample App) may be a useful starting point for programmers to work from.
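For the socket side, plain java.net is sufficient. A minimal sketch of binding a local UDP port for RTP; the class and method names are illustrative, and a real application would keep the socket open and advertise the port when registering:

```java
// Binds a UDP socket on an ephemeral port, as an application terminating
// the RTP stream itself would need to do. Names are illustrative.
import java.io.IOException;
import java.net.DatagramSocket;
import java.net.InetSocketAddress;

public class RtpSocketDemo {
    /** Binds a UDP socket for RTP on an ephemeral port and reports it. */
    public static int openRtpPort() {
        try {
            DatagramSocket rtp = new DatagramSocket(new InetSocketAddress(0));
            int port = rtp.getLocalPort();  // advertise this when registering
            rtp.close();                    // a real app would keep it open
            return port;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("local RTP port: " + openRtpPort());
    }
}
```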

How do I dial digits using the XML/Java interface?

Digits are dialed by utilizing button presses using the button id's for the digits. Please see the section 'Making a call' in the XML or Java Programmers Guide.
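The digit-to-button loop might look like the following. The ButtonPresser interface is a hypothetical stand-in for the SDK's button-press request, not a real DMCC type:

```java
// Sketch of dialling by button presses. ButtonPresser is hypothetical;
// only the digit-to-button loop is the point here.
import java.util.ArrayList;
import java.util.List;

public class DigitDialer {
    /** Hypothetical abstraction over the real button-press service. */
    public interface ButtonPresser {
        void pressButton(String buttonId);
    }

    /** Dials a digit string by pressing the dial-pad button for each digit. */
    public static void dial(String digits, ButtonPresser presser) {
        for (char c : digits.toCharArray()) {
            if (Character.isDigit(c) || c == '*' || c == '#') {
                presser.pressButton(String.valueOf(c));
            } else {
                throw new IllegalArgumentException("not a dial-pad digit: " + c);
            }
        }
    }

    public static void main(String[] args) {
        List<String> pressed = new ArrayList<>();
        dial("1006", pressed::add);
        System.out.println(pressed);
    }
}
```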

Which IP address and port should be used to monitor the IP/Phone?
Is it the IP address and port of the AE Services Server running Linux or the media server or the phone?

The application should use the AE Services server IP address and CMAPI port (in release 3.0 the default port is 4721; for 3.1, use the secure port 4722) to connect to a release 3.0 AE Services server - see the 'Establish a connection to the connector server' section in the XML Programmers Guide.

The application first establishes an application session in order to exchange application messages (requests, responses, and events) with the AE Services server - see the 'Establishing an application session' section in the XML or Java Programmers Guide. For Java applications using the API, starting a session is handled within the API by the getServiceProvider() request.

Then the application issues a getDeviceID request for each of the devices that it needs to control. A getDeviceID request requires the following pieces of information:

  • One of the following two:
    • IP address of a 'C-LAN' or PROC interface on the switch (Communication Manager)
    • A 'switch connection name' administered on the AE Services Server (this will become the preferred method in release 3.1).
  • An extension on Communication Manager, i.e. extension of the phone you want to control.

    A release 3.0 deviceID for extension 1006 on a C-LAN with IP address of 10.30.91.100 looks like:
    1006::10.30.91.100:0
    Note: For a G700/S8300, there is a single C-LAN function, so the application developer should use the IP address of the S8300 Media Server. For larger PBXs such as the S8500, S8700, and S8710 Media Servers paired with MCC1/SCC1/CMC1/G600/G650 Media Gateways, there are usually several C-LAN interfaces. In that case, the application developer would use the IP address of one of the C-LAN boards. The optional "switch connection name" lets the application developer specify only the switch the extension is on; the AE Services server then selects a C-LAN for the application through the C-LAN list ('H.323 gatekeeper list'), which must be administered on the AE Services server. See the section on 'Getting device identifiers' in the XML or Java Programmers Guide.
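For illustration only, the release 3.0 deviceID format shown above can be decomposed into its extension and C-LAN parts. Note that a later entry in this FAQ advises against constructing or parsing deviceIDs in production code, since Avaya may change the format; treat this sketch as a logging aid, not a contract.

```java
public class DeviceIdExample {
    // Splits a release 3.0 style deviceID ("ext::clanIP:0") for
    // logging/illustration purposes only. Do NOT rely on this format
    // in production; obtain deviceIDs from GetDeviceID responses.
    public static String[] parts(String deviceId) {
        String[] halves = deviceId.split("::", 2);   // ["1006", "10.30.91.100:0"]
        String ext = halves[0];
        String clan = halves[1].substring(0, halves[1].lastIndexOf(':'));
        return new String[] { ext, clan };
    }

    public static void main(String[] args) {
        String[] p = parts("1006::10.30.91.100:0");
        System.out.println("extension=" + p[0] + " clan=" + p[1]);
    }
}
```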

    A good way to see the XML message exchanges between a CMAPI client application and the AE Services server is to run one of the sample Java applications and use Ethereal to monitor and capture the packets between the client and AE Services server. Then extract the XML messages from the captured stream.

    Another method to observe the exchange of information between the CMAPI/AE Services server and the application is to increase the logging level on the server. See the article "How can I monitor the XML being sent and received by the AE Services Server?"

    The CMAPI Java SDK has the sample apps along with README instructions. The sample application most likely to provide a good starting point is the Soft Phone sample application.

Can an IP softphone be monitored by CMAPI?

Monitoring of IP Softphones is not supported up through release 3.1 by CMAPI (Call, Device and Media Control API). Attempts to register for control of an IP Softphone will cause the IP Softphone to be unregistered, and the CMAPI registration attempt will fail.

Can you provide a basic architectural overview of CMAPI?

CMAPI has undergone quite a few naming changes from one release to the next. In all the articles in this FAQ, CMAPI is synonymous with Device and Media Control and Call, Device and Media Control.

The simplest CMAPI application requires an extension on the switch (Communication Manager). An application may use more than one extension on the switch, depending on the nature of the application. The extension may be associated with a physical port (as in the case of a DCP phone) or it may be an IP endpoint. In either case, the extensions must first exist (i.e. be administered) in the switch translations before the application can successfully register them.

The CMAPI application sends a GetDeviceID request to the CMAPI/AE Services server; the request includes that station's extension along with an identifier for the switch that extension resides on (an IP or DNS address or the "switch connection name"). After the application receives a response to the getDeviceID request, it can configure monitoring services for the device, and then register the deviceID (CMAPI application) with the CMAPI server via a RegisterDevice (3.0) or RegisterTerminalRequest (3.1) request. The register device request contains the deviceID and password for the extension. The CMAPI server in turn registers the CMAPI application with the identified switch using the extension and switch identifier that were provided in the GetDeviceID request.

While handling the register request, the extension and password are used to validate the information provided against the referenced CM's translations. If the credentials are valid and the extension is properly configured (IP Softphone enabled, DCP or IP station type) and in an in-service maintenance state, the registration request succeeds.

Applications should not be run on the AE Services server. The only exception would be the use of the SDK sample applications running on the AE Services server for the purpose of troubleshooting or education. The typical network architecture involves one (or more) application servers feeding one (or more) AE Services/CMAPI servers which in turn feed one (or more) Communication Managers.

What is button 262?

During the lamp refresh audit, an application will see button 262's lamps get updated. Button 262 corresponds to the message waiting lamp, which is implemented in CM as a button. The application should use the Get Message Waiting Indicator method (or get-message-waiting-indicator.xsd) within the Physical Device services to monitor the message waiting state and not depend on the updates to button 262.

Is it possible to control the "Mute/Unmute", "Exit", "Previous", or "Next" buttons (right arrow button) through DMCC?

No, it is not possible to monitor or control the "Mute/Unmute", "Exit", "Previous", or "Next" buttons (right arrow button) through DMCC. These are functions local to the phone, and thus to the particular (soft)phone implementation; no signaling goes to or from Communication Manager when such a key is pressed. AE Services monitors the changes occurring in Communication Manager, so if the phone does not inform Communication Manager, AE Services is not aware of the button push. Other 'local' keys include 'MUTE', 'Headset', 'Volume Up/Down', 'Options', 'Contrast' controls (on some phones), 'Redial', 'Page Left/Right' and the four soft keys under the display panel.

Is it possible to collect DTMF tones in "telecommute" mode?

It is not possible to collect DTMF tones for a device that is registered in "telecommute" mode using either the DMCC .NET or Java SDK. To collect DTMF tones, the device must be registered in exclusive control mode. The application can then choose either "Server media" mode, where the connecting server handles the media, or "Client media" mode, in which case the application processes the media itself.

Is there any sample code or reference material on how to write an IP Softphone which will work in 'Roadwarrior' (client media) mode?

The DMCC Java SDK includes a "Softphone" sample code located in the directory "examples\src\com\avaya\cmapi\softphone" which can be run in either Shared or Exclusive control mode. However, this sample code does not include the implementation of the audio path in exclusive mode using client media. You will need to use a third-party stack (e.g. Java Media Framework) capable of doing RTP or use the "clientmediastack" sample code located in "examples\src\sampleapps\clientmediastack" directory included with the DMCC Java SDK. Within the DMCC .NET SDK there is sample code for a "simple call recording" application. This application utilizes client media mode to send and receive RTP data from the application. While the .NET API handles many of the details regarding the packet handling for RTP, the developer can observe much of the logic and learn from it. If the developer is using the DMCC .NET SDK, they can utilize the library calls provided by the API to handle the lower layers of the RTP media stream.

Where can I find information regarding upgrading CT Server 2.x CMAPI applications to AES Release 3.x?

You can refer to "Avaya Multivantage™ Application Enablement Services Device, Media and Call Control API XML Programmer Guide R3.1.1" document 02-300358 issue 2.2 dated May 2006 (http://support.avaya.com/elmodocs2/AES/3.1.1/02_300358_2_2.pdf). Within that document read "Appendix B: Migrating Communication Manager API 2.1 Applications to Application Enablement Services 3.0" and "Migrating from AE Services 3.0 to AE Services 3.1". The items which may need modifications during a conversion from CMAPI 2.X to DMCC 3.X are covered here. There is a similar Appendix in the "Avaya Multivantage™ Application Enablement Services Device, Media and Call Control API Java Programmers Guide R3.1" for java applications. Also reading the 'What's new' section of the 3.0 version of the programmer's guide, paying careful attention to the 'Application-affecting changes' section, will also be helpful.

Does the AE Services 4.0 DMCC (formerly CMAPI) XML SDK support Call Control services?

The AE Services 4.0 DMCC XML SDK only supports the first party Call Control services, such as making a call, answering a call, and so on. The AE Services 4.0 DMCC XML SDK does not support the third party Call Control services. These third party Call Control services will be available in the AE Services 4.1 DMCC release. Use the 'dashboard' application, which is bundled with the dotNET SDK, to get a preliminary view of the functionality that is planned to be provided in the AE Services 4.1 DMCC release.

While doing the Single Step Conference through DMCC, is there any indication on the Agent phone that it is in conference with another party?

In the "Full Participation" mode, the display on the Agent phone shows 'conference #' the moment the conference call is established, and notification is sent to the various parties that are in the conference. However, in the "Silent" mode, no notification is received on the Agent phone.

How can an application determine the active parties participating in the call, if the extension number is bridged to multiple stations and any one can answer the call?

Using Avaya Device, Media and Call Control (DMCC) XML APIs, an application can make a 'snapshotCall' request and information about all the parties in the referenced call will be returned in a 'SnapshotCallResponse' element. Element 'snapshotCallResponseInfo' is repeated for every party in the call. Inside this, element 'localConnectionInfo' shows 'Connected' state for all the active connections. For inactive members, this field shows 'null'. The XML response for this scenario is provided below:

<SnapshotCallResponse xmlns="http://www.ecma-international.org/standards/ecma-323/csta/ed3">
  <crossRefIDorSnapshotData>
    <snapshotData>
      <snapshotCallResponseInfo>
        <deviceOnCall>
          <deviceIdentifier typeOfNumber="explicitPrivate:localNumber"
              mediaClass="notKnown" bitRate="constant">2022:CMSIM::0</deviceIdentifier>
        </deviceOnCall>
        <callIdentifier>
          <deviceID typeOfNumber="other" mediaClass="notKnown"
              bitRate="constant">2022:CMSIM::0</deviceID>
          <callID>1016</callID>
        </callIdentifier>
        <localConnectionInfo>connected</localConnectionInfo>
      </snapshotCallResponseInfo>
      <snapshotCallResponseInfo>
        <deviceOnCall>
          <deviceIdentifier typeOfNumber="explicitPrivate:localNumber"
              mediaClass="notKnown" bitRate="constant">2009:CMSIM::0</deviceIdentifier>
        </deviceOnCall>
        <callIdentifier>
          <deviceID typeOfNumber="other" mediaClass="notKnown"
              bitRate="constant">2009:CMSIM::0</deviceID>
          <callID>1016</callID>
        </callIdentifier>
        <localConnectionInfo>null</localConnectionInfo>
      </snapshotCallResponseInfo>
      <snapshotCallResponseInfo>
        <deviceOnCall>
          <deviceIdentifier typeOfNumber="explicitPrivate:localNumber"
              mediaClass="notKnown" bitRate="constant">2023:CMSIM::0</deviceIdentifier>
        </deviceOnCall>
        <callIdentifier>
          <deviceID typeOfNumber="other" mediaClass="notKnown"
              bitRate="constant">2023:CMSIM::0</deviceID>
          <callID>1016</callID>
        </callIdentifier>
        <localConnectionInfo>connected</localConnectionInfo>
      </snapshotCallResponseInfo>
    </snapshotData>
  </crossRefIDorSnapshotData>
</SnapshotCallResponse>
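A minimal sketch of how an application might pick out the active parties from such a response, using only the standard Java DOM parser (the namespace and most attributes are omitted from the trimmed sample for brevity; a real application would parse the response it received from the server):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class SnapshotParser {
    // Trimmed-down SnapshotCallResponse with two connected parties
    // and one inactive party, mirroring the response above.
    static final String SAMPLE =
        "<SnapshotCallResponse>"
      + "<crossRefIDorSnapshotData><snapshotData>"
      + "<snapshotCallResponseInfo>"
      + "<deviceOnCall><deviceIdentifier>2022:CMSIM::0</deviceIdentifier></deviceOnCall>"
      + "<localConnectionInfo>connected</localConnectionInfo>"
      + "</snapshotCallResponseInfo>"
      + "<snapshotCallResponseInfo>"
      + "<deviceOnCall><deviceIdentifier>2009:CMSIM::0</deviceIdentifier></deviceOnCall>"
      + "<localConnectionInfo>null</localConnectionInfo>"
      + "</snapshotCallResponseInfo>"
      + "<snapshotCallResponseInfo>"
      + "<deviceOnCall><deviceIdentifier>2023:CMSIM::0</deviceIdentifier></deviceOnCall>"
      + "<localConnectionInfo>connected</localConnectionInfo>"
      + "</snapshotCallResponseInfo>"
      + "</snapshotData></crossRefIDorSnapshotData></SnapshotCallResponse>";

    // Returns the deviceIdentifier of every party whose
    // localConnectionInfo is 'connected'.
    public static List<String> connectedParties(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        List<String> active = new ArrayList<>();
        NodeList infos = doc.getElementsByTagName("snapshotCallResponseInfo");
        for (int i = 0; i < infos.getLength(); i++) {
            Element info = (Element) infos.item(i);
            String state = info.getElementsByTagName("localConnectionInfo")
                .item(0).getTextContent().trim();
            if ("connected".equals(state)) {
                active.add(info.getElementsByTagName("deviceIdentifier")
                    .item(0).getTextContent().trim());
            }
        }
        return active;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(connectedParties(SAMPLE)); // [2022:CMSIM::0, 2023:CMSIM::0]
    }
}
```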

Refer to 'LocalConnectionState' element in the XMLdoc (bundled with Avaya DMCC XML SDK) for further details.

Why does my application receive display updates after the call is answered and not before?

This behavior occurs when the 'Idle Appearance Preference' station option is set to 'Y'. This option is present on the second page of the 'display station XXX' form. The 'Idle Appearance Preference' option causes Communication Manager (CM) to pre-select a call appearance that is idle when the call is offered to the station. This prevents the display information from being sent when the call begins to ring. When 'Idle Appearance Preference' is set to 'N' and a call is offered to the station while the station is idle, CM pre-selects the ringing call appearance, which triggers the display information for the call to be sent at the same time that the station begins to ring. Users set 'Idle Appearance Preference' to 'Y' when they prefer that going off-hook originates a new call (on the pre-selected idle appearance) and want to manually choose, by pressing a button, when to answer a ringing call. When a station has 'Idle Appearance Preference' set to 'Y', display updates are sent after the ringing call appearance is selected. Since DMCC requires that the station be speaker equipped, selecting the ringing call appearance triggers the call to be answered in addition to triggering the display to be updated.

Can you explain why our Single Step Conference (SSC) request works with 3.1 AE Services but not with a 4.1 AE Services?

In AE Services 4.1 a hole in the security checking of the deviceID embedded in the connectionID (activeCall) element of the SSC request was closed. The application had been sending a zero in place of the deviceID in the connectionID. In AE Services 4.1 the system checks the provided deviceID to see if it is allowed to be controlled by the login that the application provided during the StartApplicationSession request. In the case where the application is providing an invalid device identifier (e.g. zero), the test fails, and the application's SSC request is rejected.

How can an application construct a connectionID to be used in a DMCC request?

As a general rule, connectionIDs should be copied from some other DMCC message and used as-is. An application should avoid constructing any CSTA identifier such as a deviceID or connectionID. Avaya may change the format of these identifiers (particularly the deviceID) at some future point, which could break any application that constructs them or parses them for information.

If an application is taking callIDs acquired from TSAPI/JTAPI and using them to construct a connectionID for DMCC, care should be taken with the deviceID element in the connectionID. The recommendation is to utilize a deviceID provided by the AE Services server from some response or event. Valid deviceIDs can be constructed using the getDeviceId() and getThirdPartyDeviceId() requests. The response to a snapshotCall() request will also contain deviceIDs that can be used when it is necessary to construct a connectionID. Another source for valid deviceIDs is the DeliveredEvent or the EstablishedEvent from Call Control services.

Why is there a difference in the XML schema definitions (XSD) for some elements, like 'get-status-link-response', between AE Services releases 4.0 and 4.1?

The XML Schema standard provides '<xsd:any>' as a wildcard element. '<xsd:any>' enables schemas to be extended in a well-defined manner: any syntactically correct XML can be inserted in place of it. This type of element allows loose coupling, enables versioning, and provides flexibility where XML APIs are evolving. The tags were in the XSDs for backward compatibility in the event the XSDs are extended in the future. However, the development team began to encounter tools that were throwing errors on encountering these tags. In digging through the issue it was determined that the solution was to give up on using '##any' to allow room for backward compatibility, so it was removed everywhere it occurred.

What is the solution to the error: "protocol version incompatible", which prevents clicking the 'Get Dev. ID' in the dashboard?

This error may be returned in response to a request when running a higher version of the DMCC SDK against a lower version of the AE Services server. The AE Services server supports backward compatibility: running a lower version of the DMCC SDK against a higher version of the AE Services server will work.

What is the difference between the Application Enablement Services (AE Services) server's 'server media mode' and 'client media mode'?

When an application uses server media mode, most of the media processing is done by the AE Services server. In client media mode, the application handles media processing itself. Client media mode gives the application the freedom to select some codecs (like G.723) which are available only in client media mode. Additionally, in client media mode the application has better control over the recording location, can be more responsive to user input, and can scale better than with server media mode. The application can use a third party utility to store the media related files.

If an application is recording in Server Media Mode, where are the audio files stored? How can these recordings be accessed?

In server media mode, media files are stored in the '/tmp' directory (by default) on the AE Services server. The user can specify any existing, properly configured directory for storing these media files using the OAM administration web page. Log in to the OAM Administration website and navigate to 'CTI OAM Home > Administration > DMCC Configuration > Media Properties'. Next, configure the player directory and the recording directory where media files are stored. These files can then be extracted (copied off the AE Services server) using SSH or SFTP.

During recording, by default the AE Services server generates file names with the format "[timestamp] [extension].wav". The generated file name is returned in the RecordMessageResponse message, which can then be used by the DMCC application. This file name can also be found in the StopMessage event. The DMCC application can also specify an appropriate file name in the RecordMessage request. File names specified for the recorded files must be relative to the configured directory, and the configured directories must already exist. Recordings cannot overwrite an existing file. If an application needs to play back these recorded messages, the PlayMessage request should be used with the same filename saved earlier during recording.
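A minimal sketch of the naming and path rules described above. The timestamp pattern used here is an assumption for illustration; the server generates its own default names, and the real format may differ.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class RecordingNames {
    // Illustrative only: builds a "[timestamp] [extension].wav" style
    // name similar to the server's default. The server's actual
    // timestamp format is an assumption here and may differ.
    public static String defaultStyleName(String extension, LocalDateTime when) {
        return when.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"))
            + " " + extension + ".wav";
    }

    // Rejects names that would escape the configured recording
    // directory (names must be relative, per the rules above).
    public static boolean isSafeRelativeName(String name) {
        return !name.startsWith("/") && !name.contains("..");
    }

    public static void main(String[] args) {
        System.out.println(defaultStyleName("1234",
            LocalDateTime.of(2008, 1, 2, 3, 4, 5))); // 20080102030405 1234.wav
    }
}
```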

Please refer to the document Avaya Application Enablement Services Device, Media and Call Control API XML Programmer Guide R4.1 An Avaya MultiVantage® Communications Application, 02-300358, Issue 3.0, December 2007.

When do I need to supply the 'switchName' field value while invoking getDeviceID request?

It is recommended that the 'switchName' always be used. The 'switchName' field in the device ID is presently only required for Call Information Services, Call Control Services, Snapshot Services and Logical Device Feature Services. If an application is not using any of these services and does not wish to take advantage of the round-robin H.323 Gatekeeper assignment feature, it is not required to administer an H.323 gatekeeper list or specify a switchName in the GetDeviceID request.

To add an H.323 Gatekeeper address, use a web browser and navigate the AE Services web page to 'CTI OAM Home > Administration > Switch Connections'. Then select the appropriate connection from the list and click on the 'Edit H.323 Gatekeeper' button. Add the IP address(es) of the C-LAN(s) or procr interface on CM that are to be used for the H.323 device registrations for DMCC devices. The application can then use the switchName provisioned in AE Services when making getDeviceID requests for that CM.

For further information on this, please refer to section 'Populating the Switch Name field', chapter 3 of document Avaya Application Enablement Services Device, Media and Call Control API XML Programmer Guide R4.1 An Avaya MultiVantage® Communications Application 02-300358 Issue 3.0 December 2007.

How would an application go about recording a call at a specific device?

A white paper has been written on this topic. It is recommended reading for anyone interested in call recording methodologies:

DevConnect Developer Guide: Developing Client-side Call Recording Applications using Avaya Application Enablement Services (899 KB .pdf)

If the application registers the device (extension) in exclusive mode and specifies "client media" (and provides an IP address for the RTP media to be directed to), then the application will receive the media streams for calls that device is involved in. As of AE Services release 4.0 exclusive mode was renamed "Main" mode. An alternative approach to this method is described as Method 3 below.

There are three design approaches to performing call recording using DMCC APIs and CM. The first two create an extension and cause that extension to be "conferenced" into a call. This conferencing can be done with service observing or the single step conference feature. Applications that utilize these approaches will implement one soft phone per recording device.

  • Method 1: Using service observing
    The soft phone has a pre-provisioned service observing (SO) button on the recording device/soft phone. The SO button is provisioned with the destination station for which calls are to be recorded. When the application initializes, it activates the SO feature. Then, when calls arrive at the observed extension, the application is automatically notified of the call arrival and answers the call in conjunction with the destination station answering. In Communication Manager 4.0 the maximum number of Service Observers in a call was increased to two.
  • Method 2: Using SingleStepConference
    In this solution the application uses call control services of TSAPI, JTAPI or 4.1 DMCC to monitor the recorded station for incoming calls and/or a specific button push that indicates the user wishes recording to begin. The application then invokes the TSAPI/JTAPI Single Step Conference (SSC) feature to conference in the recording device DMCC station when the recorded station is active on a call.

    These solutions will work if the monitored extension is making use of telecommuter mode, or direct media (either TDM or IP).

  • Method 3: Multiple Registrations for an Extension
    A third approach is available with AE Services 4.1 and CM 5.0. The advantage of this approach is that it does not encounter the limitations imposed by Communication Manager's maximum party count for Service Observers or parties in a call. In this approach a client media mode endpoint in dependent or independent dependency mode is registered against the extension that the application wishes to record. The application provides RTP address information and codec information as part of the registration sequence. When calls are handled by the registered extension, the application receives a copy of the audio of the call. This audio stream is a sum from all parties (including the extension that the application registered as). Information about the participants of the call can be discovered through DMCC's call control services.

Note that within the .NET SDK there is sample code for a SimpleRecord application which implements method 2. If you are interested in using this method, observing the behavior of this sample code is strongly recommended.

In what scenarios will the application receive an InvalidDeviceIDException?

An InvalidDeviceIDException is returned when the deviceID supplied in the request is improperly formatted (e.g. missing or incorrect fields). Some applications have been manufacturing their own deviceIDs when interfacing to both TSAPI/JTAPI and DMCC. Avaya reserves the right to change the format of deviceIDs in the future, so the application should not format its own deviceIDs based on a perceived understanding of the current arrangement of internal fields. It is recommended that the application use GetDeviceID or GetThirdPartyDeviceID requests to get the correct device ID. The former request is used for first party control and the latter can be used for third party control. For further information, please refer to section 'Populating the Switch Name field', chapter 3 of document Avaya Application Enablement Services Device, Media and Call Control API XML Programmer Guide R4.1 An Avaya MultiVantage® Communications Application, 02-300358, Issue 3.0, December 2007.

How can frequency, pitch, or the tone of the voice in a conversation on a real time basis be collected or measured in the AE Services server using DMCC Java SDK?

Avaya does not provide API capabilities to collect or measure frequency, pitch, or the tone of the voice in the conversation on a real time basis. The application must implement or interface to a third party application that provides these services.

In order to implement such a solution using the DMCC SDK, the user application needs to register the DMCC station in "Client Media Mode", and then pass the received RTP information through the analytical function.

Avaya provides a mixed RTP voice stream of all call participants, so the analysis must take this into account when processing the data. It is not presently possible to access independent voice streams for specific call participants through DMCC or other APIs available from AE Services.

Is an AE Services server required to build and run a H.323 softphone client application using the Device, Media and Call Control (DMCC) SDK?

The DMCC SDK always communicates with the AE Services server and hence the server is required for both the development and production environments. The Avaya Aura Basic Development Environment (BDE) can be used while developing an application as it contains both an AE Services server and a Communication Manager server. The softphone client created using the DMCC SDK is not an H.323 softphone. The AE Services server communicates with the Communication Manager using H.323 and other protocols, but the messages between the DMCC client and the AE Services server are actually in XML format and use CSTA Phase III concepts.

Is there a way to indicate call progress events in DMCC, similar to the mechanisms provided by H.323, SIP or ISDN?

Call progress can be monitored by a Device, Media and Call Control services application using DMCC protocol version 4.1 or later. With the DMCC API, the application can use call-related events to be informed of call progress, such as: OriginatedEvent, DeliveredEvent, EstablishedEvent, ConnectionClearedEvent, TransferredEvent, ConferencedEvent, etc.

How can a user get the full details of the display as it is shown on the hard phone?

The following are the steps to retrieve the contents of the display:

  1. For DMCC Java SDK, the DisplayUpdatedEvent.getContentsOfDisplay() method can be used to provide the display contents.
  2. For DMCC XML SDK, the contentsOfDisplay element in the DisplayUpdatedEvent event has to be parsed to retrieve the contents of the display.
  3. For DMCC .NET SDK, the DisplayUpdatedEventArgs.getContentsOfDisplay() method can be used to provide the display contents.

What is the length of the display on a telephone?

There is no limitation on the length of the display from the SDK side. Whatever the phone displays will be captured by the DMCC SDK in the DisplayUpdatedEvent event structure. Hence, this is independent of the phone set type. Within Communication Manager, the display information is dynamically generated based on the available information and the station type. Typical display areas are between 24 and 80 characters. However, some station types support multiple display lines, and thus the DMCC API may provide a longer field in the DisplayUpdatedEvent.

How can the real number of the external party be determined in an OFF-PBX call?

For incoming calls, the Delivered Event and Established Event provide the ANI (Automatic Number Identification) of the calling device when it is available. For incoming calls over PRI (Primary Rate Interface) facilities, the "calling number" information element from the ISDN SETUP message or the assigned trunk identifier is specified in the event. If the "calling number" is not provided, a dynamic device ID is supplied in the Delivered Event. In scenarios where the monitored party is not the original destination of the call (e.g. call transfer), ANI may not be available. In a conference or transfer scenario where the initial call is not monitored, the trunk information is not always available in the call control events to the monitored party, although it may appear on the station display of the transferred-to party. Even if the original call was monitored, there are call scenarios where the calling party information is not provided to the transferred-to or conferenced-in parties, even though it may appear on those parties' station set displays.

Using Device, Media and Call Control's Display monitoring capability, an application can access the display information on a station. By using this service on the transferred-to extension, an application may be able to access the external party's number. Feature buttons on Communication Manager for 'ANI Request' (ani-requst) and 'Conf Display' (conf-dsp) may be used on the station (if they are provisioned) to access calling party information in this scenario. If an application actively manipulates the display of the station by using these buttons, the end user of that station will observe the changes to the station display.

For outgoing calls, gaining access to the called number is difficult and sometimes not possible. If the application originated the call using the Make_Call or Consultation_Call methods, the application knows the digits of the external number used to place the call. The call ID provided in the response to the CTI request can be saved along with the external number that was provided to Make_Call or Consultation_Call. When someone answers the external number, the application receives an Established_Event, and the call ID can be used to determine the called number (which may or may not be the answering party's number). This works only when the call is initiated by the application. When the call is initiated manually from a physical device, the called number for the external party is not available.
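The bookkeeping described above can be as simple as a map from call ID to the dialed digits, populated when the Make_Call/Consultation_Call response arrives and consulted when the Established event is received. A sketch (the method names are hypothetical hooks the application would call from its own response and event handlers):

```java
import java.util.HashMap;
import java.util.Map;

public class DialedNumberCache {
    private final Map<String, String> dialedByCallId = new HashMap<>();

    // Record the digits when the MakeCall/ConsultationCall response
    // returns a call ID (hypothetical hook, called by the app's
    // response handler).
    public void onMakeCallResponse(String callId, String dialedDigits) {
        dialedByCallId.put(callId, dialedDigits);
    }

    // Look up the dialed number when the Established event arrives;
    // returns null for calls the application did not originate.
    public String onEstablished(String callId) {
        return dialedByCallId.get(callId);
    }

    public static void main(String[] args) {
        DialedNumberCache cache = new DialedNumberCache();
        cache.onMakeCallResponse("1016", "913035381234");
        System.out.println(cache.onEstablished("1016")); // 913035381234
    }
}
```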

Where can I find information about how to generate new listener objects after my standby application instance issues a TransferMonitorObjects request when recovering from a failure of the primary AE Services server?

The TransferProxy service is what your application will have to use in order to get new listener objects. Information on this service was inadvertently left out of the 4.1 and 4.2 programmer's guides. This will be rectified in the next AES release. In the interim, the DMCC Java Programmer's Reference (Javadoc) details how to use this service, and an example of how to use the service can be found in the SessionManagementApp in the Java SDK. For details on session recovery see the 4.1 or 4.2 DMCC Java Programmer's Guide section titled "Recovery."

Is it possible to monitor a single station using two AE Services servers simultaneously?

Yes, it is possible to monitor a single station using two AE Services servers simultaneously using the 3 Endpoint Registrations per Extension feature with DMCC in AE Services server 4.1.

Different DMCC applications may register for (i.e. monitor) the same extension through up to three AE Services servers simultaneously (three is the maximum). Each DMCC client application establishes separate signaling paths through the different AE Services servers. Each application may optionally establish a separate media path as well, which can be over a unique hardware path if the solution is appropriately provisioned. A single application may generate multiple registrations for the same device through different AE Services servers. If one AE Services server fails, the DMCC client application can continue using the signaling and media paths of the second registration. Note that this is not an automatic failover of AE Services. To use this feature, make sure that the application is registered with a separate AE Services server and that alternate network path connections (i.e., signaling and media paths) are also established.

An application using the same credentials (login and password) can register up to three instances of a specific device through a single AE Services server.

Please note that the 3 Endpoint Registrations per Extension feature requires Communication Manager Release 5.0 and AE Services server Release 4.1 or later. Three registrations are only available for a DCP station. An IP station or IP soft phone will consume one of the three registrations.

The three-registration capability is a limit in Communication Manager. One registration is typically consumed by the physical station (or a soft client, e.g. one-X Communicator). In that case Communication Manager will restrict AE Services to two registrations. Alternatively, an application may use all three registrations (e.g. in independent mode) for some purpose.

Up to four (Release 6.3 and earlier) or eight (release 7.0) AE Services servers may establish a DMCC call control monitor for a single device. This is a limitation inherited from TSAPI. There are caveats to this statement. Please review the FAQ titled “How many individual applications can monitor a specific device?” in the General AE Services category.

What is the possibility that a soft phone implemented with DMCC will transparently switch over to a secondary CLAN if the primary fails?

It depends on what service(s) the failed C-LAN was providing.

If the failed C-LAN was providing H.323 registration service for the device, the application will be notified through a TerminalUnregisteredEvent. There is no possibility that a soft phone implemented with DMCC will automatically switch over to an alternate C-LAN. AE Services does not currently (as of release 4.2) support automatically switching to an alternate C-LAN from the AE Services H.323 Gatekeeper List. Hence, the application must explicitly re-register the device until it succeeds (in this case it waits for the C-LAN to come back into service). If multiple C-LANs are available, the application can release the deviceID, request a new one and use an alternate C-LAN. AE Services supports an H.323 Gatekeeper list which does a round-robin allocation of C-LANs to incoming deviceID requests. If the application uses this capability, the AE Services server will allocate a C-LAN for the subsequent deviceID request (not necessarily a different one, so multiple re-requests may be needed). For more details see the DMCC XML Programmer's Guide available on the Devconnect web portal.

If the failed C-LAN is supporting a switch connection over which a CTI-link that the application was using is provisioned and there are other C-LANs in the provisioned list of C-LANs for that switch connection, then the failure is transparent to the application. If there is insufficient messaging transfer capacity with the reduced number of C-LANs, there will be noticeable issues with the CTI-link.

If the C-LAN that is lost is the last available C-LAN in the switch connection's C-LAN list, then the application will be notified of a call information link failure, and a monitor stop event. If the monitor was created on just call control services, then only that monitor is stopped. In the event that the monitor was on multiple services, then that set of monitors is stopped. When the switch connectivity CTI-link is restored, an event is sent to a call-information monitor. At this point the application can re-establish the monitor(s) and re-register the device.

Note that call information services use an AE Services to Communication Manager Definity API (DAPI) link which utilizes the switch connection link. The DAPI link is hidden from view on the AE Services OA&M interfaces. The DAPI link uses the same switch connection transport link that TSAPI services use, so the call information service link up/down events can be used to gain insight into the status of the TSAPI link. However, they do not necessarily have a one-to-one correspondence (TSAPI may be down while DAPI is up in some instances).

How can an announcement be played every time an agent receives a call?

An announcement can be played every time an agent receives a call using the DMCC Call Control Services. The following steps are needed to play the announcement:

  1. Register an extension (X) for playing the announcement through DMCC. The dependency mode can be either server media mode or client media mode; client media mode is probably the better choice from a performance and scalability perspective.
  2. Monitor the Agent for receipt of an Established event where the Agent's station is the answering party.
  3. When an inbound call is established at the Agent's extension, Single Step Conference an extension registered through DMCC (X from step 1) to play the desired announcement into the call.
Note: Most often, the Agent and the calling party will both hear the announcement. To determine when to initiate the Single Step Conference request, the application should use Call Control Services. When an Established event is received for a call delivered to the Agent, the Single Step Conference should be initiated. The extension that is conferenced in can be part of a pool of DMCC soft-phones registered in the main dependency mode. The application must disconnect the DMCC announcement source from the call once it has finished playing. If an announcement extension is used, it will automatically disconnect once the announcement has completed.

While making a call from dashboard, how do I specify the extension in 'calledDirectoryNumber' field when calling an OFF PBX number?

If a call is made to an OFF PBX number, Automatic Route Selection code (ARS), Automatic Alternate Routing code (AAR) or Trunk Access Code (TAC) has to be specified in the 'calledDirectoryNumber' field.

Following is the 'calledDirectoryNumber' format:

< TAC/ARS/AAR >< Extension number >:< Communication Manager Name >:< IP Address of the Communication Manager >:< CSTA Extensions >
where < Communication Manager Name > and < IP Address of the communication manager > are optional fields.
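The format above can be sketched as a small Python helper. This is an illustrative sketch only; it assumes that the delimiting colons remain even when the optional fields are left empty, which should be verified against the DMCC documentation:

```python
def build_called_directory_number(access_code, extension,
                                  cm_name="", cm_ip="", csta_extensions=""):
    """Assemble a 'calledDirectoryNumber' string for an off-PBX call.

    access_code is the TAC, ARS or AAR code; cm_name, cm_ip and
    csta_extensions are optional and may be left empty. Assumes empty
    optional fields still keep their delimiting colons (illustrative).
    """
    return "{}{}:{}:{}:{}".format(access_code, extension,
                                  cm_name, cm_ip, csta_extensions)
```

For example, with ARS code 9 and number 3035551234 and no optional fields, the helper produces "93035551234:::".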

Does the existing DMCC license allow deploying a stand alone application that has avaya.crt and ServiceProvider.dll included in the product deployment?

Applications may freely distribute the avaya.crt and ServiceProvider.dll files with a DMCC application. The legal allowance for this is provided in a LICENSE.txt file that is bundled with the DMCC SDK. The license file has the following specific statement that covers the distribution of these files 'Avaya further grants Licensee the right, if the Licensee so chooses, to package client files with Licensee's complementary applications that have been developed using DMCC SDK.'

How many parties can participate in a conference using Device, Media and Call Control (DMCC)?

DMCC supports a maximum of 6 parties in a conference. This limit is imposed by Avaya Communication Manager. It is not possible to increase the number of parties beyond 6 in a conference.

Is it possible to get the Agent ID of the Agent logged in to a phone using Device, Media, and Call Control (DMCC)?

The DMCC API is not meant for developing call center related applications. DMCC does not provide a generalized API to retrieve the Agent information for the Agent logged into a specific station extension. The TSAPI and JTAPI APIs are to be used for developing call center related applications.

What licensing do I need to purchase to utilize DMCC (formerly CMAPI) services?

To utilize the basic set of DMCC services, Communication Manager must be provisioned with an appropriate number of IP_API_A licenses. One IP_API_A license per registered device is required. A device may be registered as either shared control or exclusive control. In either case each device will consume one IP_API_A license. A few DMCC services (i.e. snapshot call, snapshot device and single step conference) also utilize a TSAPI Basic license for the duration of the transaction in the AE Server.

What could cause an error during a restore when using the OA&M back up and restore functionality for the AES database?

Under certain conditions (e.g. after installing WinZip 8.0), Microsoft Internet Explorer will change the extension of the downloaded file; e.g. a file mvapdb21022007.tar.gz would be renamed to mvapdb21022007.tar. This causes the database import to behave erroneously and the database not to be updated properly.

This affects all versions of AES that generate files with .tar.gz extensions when accessing OA&M using certain Windows configurations, and the extension may be changed differently depending on the configuration of the Windows machine (e.g. changed to .zip, etc.). The OA&M administrator must ensure the file has the proper extension of .tar.gz prior to invoking the restore.

How can I troubleshoot why my DMCC Softphone application is not receiving RTP media?

Here are some methods to tell how far through setting up media your application is getting, and where to start troubleshooting.

In DMCC you should be creating start/stop media monitors. Events sent to these listeners will tell the application that CM has created media sockets to receive media on (start media), and that CM has begun sending media to the RTP addresses the application specified when it registered.

XXXX is the DMCC softphone extension in the following descriptions.

SAT command 'list trace station XXXX'
This command will monitor the high level call events occurring at a station. As the station is added to a call (e.g. via single step conference), you should see messages showing the codec, IP address and other RTP parameters for the connection displayed. If there is a problem with establishing media, in most cases you will see a 4 digit event code and an English description of the fault. This can be very helpful when troubleshooting media establishment issues.

SAT command 'status station XXXX'
Once a call is established (and you were not doing list trace), you can look at the RTP parameters for the call using the status station command at the SAT.

If you see no errors during list trace station, you can use a LAN sniffer to observe the RTP media leaving the MedPro and headed for the application. At this layer you can see issues such as the application not having opened the socket CM is using (visible as ICMP responses), media being sent to a socket other than the one the application is expecting, and differences in codec selection.

Common mistakes are:

  1. No media processing resources available in the network region the application is registering in.
  2. Codec mismatch between the application and the codec list specified for the network region the application is registering in.
  3. The application did not open a socket for the RTP address it specified when registering.
  4. IP routing issues between the MedPro (or G700 gateway) and the application.
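Mistake 3 above can be avoided by binding the RTP receive socket before registering, and advertising the actually bound address and port in the registration request. A minimal Python sketch of that ordering (the function name is illustrative):

```python
import socket

def open_rtp_socket(local_ip="0.0.0.0", preferred_port=0):
    """Bind the RTP receive socket *before* registering the DMCC device,
    then report the actual address/port back so the registration request
    advertises a socket that is really open (common mistake no. 3).
    Passing port 0 lets the OS choose a free port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((local_ip, preferred_port))
    ip, port = sock.getsockname()
    return sock, ip, port
```

The socket returned here is the one the application should then read RTP from; if it is closed or never bound, CM's media will trigger ICMP port-unreachable responses, which is exactly what the LAN sniffer check above would reveal.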

Dashboard.exe
In the .NET SDK there is a test application called the dashboard (due to the number of exposed controls). This application is a very useful learning tool regarding the XML exchange between the AES and the application. It can be used to establish an RTP session between the tool and CM, and the knowledge gained can be applied to the developer's application.

simple call record .NET application
In the .NET SDK there is sample code for a simple call record application that goes through the necessary steps of opening up a RTP path between the application and the AES/CM. The .NET API provides logic in the .dll that handles much of the plumbing work for the application, so while helpful, it is not a complete answer for those using java or straight C/C++ code for their application.

The .NET SDK can be found by logging into the Avaya web portal and changing your URL to:
https://devconnect.avaya.com/public/dyn/d_dyn.jsp?fn=125

Look for the following link on that page:
Application Enablement Services IP Communications SDK (Device and Media Control/CMAPI) (.NET)

How does an application take the RTP stream (G.711U or G.729) from DMCC and create a wav file?

G.729 carries 10 ms of voice which has been compressed eight times relative to a comparable G.711 stream. Thus every 10 ms, ten bytes of encoded audio information is available. These 10-byte groups are called frames. There are also SID (Silence InDication) frames/packets that are 2 bytes long and represent silence information. The RTP payload type does not change for G.729 SID packets, only the length; the length of the packet is the signal to the far-end decoder that a SID packet is being sent/received. An RTP packet can contain one or more frames of G.729. There is no clear standard on where the SID packets can be sent. The typical implementation sends one as: 1) a stand-alone packet, or 2) the last frame in a sequence of frames in a single packet.

To capture G.729, the application will need to do something along the following guidelines (assuming you are operating above the UDP layer):

  1. Remove the 12-byte RTP header.
  2. Attach your own header indicating the length of the data portion of the packet.
  3. Store the data.
  4. On playback, send packets just the way you received them.

To capture G.711, just remove the 12-byte RTP header and store the data. On playback, send the data in the negotiated sized "frames" based on 80 bytes per 10 ms units. Remember to flow control the application output based on a real time clock so the far-end decoder's jitter buffer does not overflow and no data is lost.
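The capture guidelines above can be sketched in Python. The fixed 12-byte header assumes no CSRC entries or RTP header extensions (a full implementation would parse the CC and X bits of the first header byte), and the one-byte length prefix is an illustrative framing choice, not part of any standard:

```python
RTP_HEADER_LEN = 12          # fixed RTP header; assumes no CSRCs/extensions
G711_BYTES_PER_10MS = 80     # 8000 Hz * 1 byte/sample * 0.010 s

def store_rtp_payload(packet, capture):
    """Strip the 12-byte RTP header and store the payload with a
    1-byte length prefix (sufficient for typical 2-160 byte payloads),
    as the capture guideline above describes."""
    payload = packet[RTP_HEADER_LEN:]
    capture.append(bytes([len(payload)]) + payload)
    return payload

def g711_frames(payload, frame_bytes=G711_BYTES_PER_10MS):
    """Split stored G.711 audio back into 10 ms frames for paced playback."""
    return [payload[i:i + frame_bytes]
            for i in range(0, len(payload), frame_bytes)]
```

On playback the application would walk the stored records, honoring the FAQ's advice to pace output against a real-time clock (roughly one frame every 10 ms) so the far-end jitter buffer is not overrun.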

How can one verify that the Device, Media and Call Control (DMCC) service is running on an AE Services server?

To check the Services running on an AE Services server, follow the procedure described below:

  1. Open the AE Services server's OA&M web page at http://<IP address of AE Services Server>
  2. Click on AE Server Administration and login with the username craft or a customer created AE Services administrator login and password.
  3. Click on CTI OAM Administration to go to the CTI OAM Home page where the states of all the Services that are running on the AE Services server are shown. The information that is presented includes the status of the DMCC service.

Does Device, Media and Call Control (DMCC) require a link between Communication Manager and AE Services server?

Depending on which DMCC service is being used, DMCC may require a link to be configured between Communication Manager and the AE Services server. For Call Information Services, a switch connection needs to be configured. Call Control Services and Feature Control/Information Services require both a switch connection and a TSAPI link to be configured. The DMCC API comprises the Phone and Media services, which provide first party call control; the DAPI service, which provides the Call and Link Information services (these require a switch connection); and TSAPI services (these require a TSAPI link), which provide third party call control capabilities such as the ability to place calls, create conference calls, divert calls, reconnect calls, and monitor call control events.

Is there a way through the Device, Media and Call Control (DMCC) interface to get the number of a calling party?

Using the DMCC Call Control services, it is possible to retrieve the number of a calling party. The device answering the call (the destination) must be monitored with a Call Control Monitor to look for the calling number information in the Delivered or Established events. Alternatively, the GetDisplay(Object) method can be used to get the information presently displayed on the phone. The application should parse the display information to extract the calling party's number. Note that in some call scenarios (e.g. a conference), the information on the phone's display will not contain calling party information.
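The display-parsing fallback might look like the following Python sketch. The display layout varies by call scenario, station type and administration, so this is only an illustrative heuristic, not a parser for an actual Communication Manager display format:

```python
import re

def extract_calling_number(display_text):
    """Pull the first run of 3+ digits out of the phone's display text.

    Illustrative heuristic only: real display contents depend on the
    call scenario (and, as noted above, a conference display may carry
    no calling party information at all)."""
    match = re.search(r"\d{3,}", display_text)
    return match.group(0) if match else None
```

For production use, the Delivered/Established events from a Call Control Monitor are the reliable source; the display parse is a best-effort fallback.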

Which licenses do Device, Media and Call Control (DMCC) require on Avaya Communication Manager?

Starting with Communication Manager 5.0 coupled with AE Services server 4.2 a 'DMCC DMC' license from AE Services is used. For older releases, or if no 'DMCC DMC' license is available, an attempt will be made to allocate an 'IP_API_A' license from Communication Manager instead. One 'DMCC DMC' or 'IP_API_A' license is used for each registered DMCC station. If DMCC uses Call Control services or Feature Control/Information services, then a TSAPI basic license is also required. A DMCC device registration will consume an IP Station license on Communication Manager.

Is it possible to use the multiple registrations per device capability with an extension that has the EC500 feature enabled?

Yes. However there are limitations regarding the level of media support.

Using the multiple registrations per device capability requires the following:

  1. The IP Softphone flag on page 1 of the change station form must be enabled.
  2. Communication Manager must be running a minimum release of 5.0.
  3. AE Services must be running a minimum release of 4.1.
No additional configuration is required for an EC500 enabled extension to utilize the multiple registrations per device capability.

The IP Softphone flag appears on stations that have been configured with a station Type that uses the IP (H.323) or DCP protocol to interface to Communication Manager (e.g. Station Types of 46xx, 96xx, 24xx, 54xx, 64xx, etc).

Enabling the IP Softphone setting allows an Avaya IP Softphone or DMCC device registration to be used with the station.

When the registration utilizes client media mode, Communication Manager replicates the media stream of the physical phone and sends it to the DMCC device whenever the physical device is active on a call. If the call is answered by the EC500 destination (e.g. a PSTN phone), the media is not replicated and sent to the DMCC application (as of release 5.2).

Can an IP Softphone be monitored using DMCC?

Yes, beginning with AE Services Release 4.1 coupled with Communication Manager Release 4.0, the DMCC service supports monitoring and control of Avaya IP Softphone.

When is a DMCC_DMC license allocated and when is it not?

The DMCC_DMC license is acquired from the WebLM associated with AE Services. The DMCC_DMC is a replacement for an IP_API_A license. In order for a DMCC_DMC license to be allocated a few pre-conditions must be met, otherwise an IP_API_A license from Communication Manager will be allocated in its place (assuming that it is available).

  • The AE Services release must be 4.2.2 or more recent and Communication Manager's release must be 5.1 or more recent. Note that although the DMCC_DMC license capability is advertised as being available with the AE Services' 4.2 release, there was a bug preventing proper allocation (when used with Communication Manager Release 5.2) that was fixed in AE Services release 4.2.2.
  • There must be DMCC_DMC licenses available from the WebLM license server for AE Services.
  • The registration method must be registerTerminal (RegisterTerminalRequest if using XML). If the registerDevice method (a deprecated method) is used, then an IP_API_A license will always be allocated.
  • A 'switch connection' must be provisioned between AE Services and Communication Manager for the device undergoing registration. The switch connection link allows AE Services to inform Communication Manager that a DMCC_DMC license has been allocated. If a switch connection is not present in the contents of the deviceId, then AE Services defers the license allocation to Communication Manager, which will attempt to acquire an IP_API_A license. If the application is using IP addresses (for a CLAN or procr) when allocating deviceIds, then the H.323 Gatekeeper List must be configured in AE Services so an association can be made between the IP address and the appropriate switch connection.
  • The DeviceID specified in the RegisterTerminalRequest must include a switch-name that matches the 'switch connection'. This can be achieved by either:
    • specifying the matching switch-name (Connection Name) explicitly in the 'GetDeviceId' request, or
    • specifying a CLAN or procr (Processor Ethernet) IP address that is included in the "Edit PE/CLAN IPs" H.323 Gatekeeper's list provisioned for the appropriate 'switch connection'.
  • In the original version of this FAQ it stated, "The DMCC protocol version of the DMCC API referred to in the StartApplicationSession interface must be http://www.ecma-international.org/standards/ecma-323/csta/ed3/priv3 (AE Services R4.2.2) or later." When testing with AE Services 6.1 it was found that when protocol version 3.0 was used, AE Services allocated a DMCC_DMC license. This may have been true with AE Services 4.2 and greater, or it could be a change made to the product in some intermediate release that is not clearly documented.

Can a Vector Directory Number (VDN) be monitored using DMCC?

Monitoring of VDNs is officially supported starting with release 6.0 of Application Enablement Services. In prior releases of AE Services, VDN monitors are not officially supported, although some functionality may work with release 5.2.

What is the difference between a ‘DMCC Full’ license and a ‘DMCC Basic’ license?

DMCC leverages H.323 stations in Communication Manager. H.323 stations leverage basic station functionality. Thus, to have a stand-alone DMCC device, you must have rights to use (RTU) for a station and an IP station. Additionally, you must have an RTU for the DMCC service, which is sold in increments of registered DMCC devices.

DMCC has a number of different modes (Main, Dependent, Independent). A Main mode device is a stand alone device. Dependent and Independent need a Main mode device to work. A Main mode device can either be a Main mode DMCC device, or a desk or soft phone.

Depending on which DMCC mode your device will register in, a different collection of licenses is needed.

For historical reasons two forms of DMCC licenses were made available to account for what licensing a customer may already have in place/available before the DMCC application was added to the environment. Some customers have a large number of VALUE_STA and VALUE_IP_STA available to them prior to the addition of a DMCC application into the environment. A DMCC Basic license can often meet their needs. For other customers who are utilizing their Station and IP Station licenses, they need to add to that capacity while deploying DMCC based applications.

DMCC Full provides entitlements for a CM Station (VALUE_STA) license/rtu and a DMCC (VALUE_AES_DMCC_DMC) license/rtu. DMCC Basic only provides the DMCC license/rtu component. Currently R8 CM provides an entitlement for 18,000 IP Stations. Historically a DMCC Full license has provided a VALUE_IP_STA license, but when IP Stations became an entitlement this binding was dropped.

  • A Main mode DMCC device (e.g. what is needed for Single Step Conference and Service Observing forms of call recording) will consume a CM station (VALUE_STA), an IP Station (VALUE_IP_STA) and a DMCC (VALUE_AES_DMCC_DMC) license/rtu.
  • A Multiple Registration form of a DMCC device, e.g., a call recorder (which will use DMCC Dependent or Independent mode), will consume an VALUE_IP_STA and a VALUE_AES_DMCC_DMC license/rtu but not a VALUE_STA license/rtu.

What licenses are required for DMCC based Call Recording solution?

A number of different licenses are required for the complete solution. There are three forms of recording solutions: Single Step Conference, Service Observing and Multiple Registration. Each has its own licensing requirements. Further, depending on the AE Services and Communication Manager Release there can be a difference in where the DMCC license resides (the AES: VALUE_DMCC_DMC or the CM: IP_API_A).

For a review of the different methods of performing call recording with Avaya Communication Manager and AE Services please see the developer guide Developing Client-side IP Call Recording Applications using Avaya Application Enablement Services available from the Devconnect portal. This document covers various advantages and limitations of the different recording designs. Familiarize yourself with this document and choose a design approach that meets your requirements.

Licenses

  • In all the described forms of call recording, a DMCC device is used as a recording device. DMCC devices used to record media must be registered, and the act of registering a DMCC device consumes a DMCC license. See the section DMCC Licenses below for a full description of this license type.
  • Typically, the application monitors a target device (station) for calls using DMCC, TSAPI or JTAPI. In all of these cases, a TSAPI device monitor is used which consumes a TSAPI Basic User License (VALUE_TSAPI_USERS). TSAPI Basic User Licenses are managed by the AE Services' WebLM server. Monitoring the target is not strictly required but, without a monitor, the application will miss important information about the call (e.g. ANI, DNIS or redirecting information). For these reasons most call recording applications will consume a TSAPI Basic User license.
  • If Single Step Conference (SSC) is used to join a recorder into a call, the SSC request consumes a TSAPI Basic User License (VALUE_TSAPI_USERS) for the duration of the call. This license is in addition to the license used to monitor the target station. This extra license is not required for Service Observing or Multiple Registration.
  • A recording solution that utilizes the Single Step Conference or Service Observing methods will consume a station license (VALUE_STA; "Max Stations:" from page 1 of display system-parameters customer-options form) per recorder. The Multiple Registrations method does not consume a VALUE_STA license.

DMCC Licenses
The preferred license to use is a DMCC_DMC license (VALUE_DMCC_DMC) from the AE Services' WebLM server. However, in order to use a DMCC_DMC license, all of the following criteria must be met:

  • the Communication Manager release is 5.1 or later
  • the AE Services release is 4.2.2 or later
  • there is a provisioned switch connection between AE Services and the Communication Manager on which the device will be registered
  • the DeviceID in the RegisterTerminalRequest includes a switch-name that matches the switch connection
  • the WebLM server contains available VALUE_DMCC_DMC licenses
  • the DMCC protocol used in the StartApplicationSession request is http://www.ecma-international.org/standards/ecma-323/csta/ed3/priv3 (R4.2) or later
  • the registration method is RegisterTerminal and not the deprecated RegisterDevice

If all of these criteria are met, then, when a license is available from the WebLM server, a VALUE_DMCC_DMC license is used. Otherwise, an IP_API_A license on Communication Manager is required.
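The decision above can be summarized as a small predicate. This Python sketch only mirrors the checklist in this FAQ; the parameter names are illustrative and the release comparisons use (major, minor, patch) tuples:

```python
def dmcc_license_source(cm_release, aes_release, has_switch_connection,
                        device_id_has_switch_name, weblm_has_dmcc_dmc,
                        protocol_is_priv3_or_later, uses_register_terminal):
    """Mirror the checklist above: return which license pool a DMCC
    registration draws from. Releases are (major, minor[, patch]) tuples;
    all parameter names are illustrative, not part of any API."""
    criteria_met = (cm_release >= (5, 1)
                    and aes_release >= (4, 2, 2)
                    and has_switch_connection
                    and device_id_has_switch_name
                    and weblm_has_dmcc_dmc
                    and protocol_is_priv3_or_later
                    and uses_register_terminal)
    return "VALUE_DMCC_DMC" if criteria_met else "IP_API_A"
```

Failing any single criterion (for example, using the deprecated RegisterDevice method) falls back to an IP_API_A license on Communication Manager.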

The DMCC License (either VALUE_DMCC_DMC or IP_API_A), the IP Station license (VALUE_IP_STA) and the Station license (VALUE_STA) are typically bundled together as a DMCC Full license. A DMCC Basic license is used by customers who have a large pool of existing unused VALUE_IP_STA and VALUE_STA licenses from which they can draw.

Application Enablement Protocol Licensing
For Communication Manager release 4.x and earlier, in order to access TSAPI services, the AE Services server requires an Applications Enablement Protocol (AEP) license (VALUE_AEC_CONNECTIONS) for each IP connection to Communication Manager. Avaya recommends a minimum of two AEP connections between each AE Services server and a specific Communication Manager. For AE Services release 5.2 or later, AEP connections are not licensed.

If the target customer site has large recording needs you may need additional AEP connections (VALUE_AEC_CONNECTIONS). The TSAPI monitors trigger ASAI events (messages) between Communication Manager and AE Services. A single AEP can handle 200 messages per second from AE Services to Communication Manager and 240 messages per second from Communication Manager to AE Services. The minimum number of ASAI messages per answered call is 5 (five). Various features and other activity will increase the traffic. The maximum traffic from Communication Manager to AE Services is 1000 messages per second. The maximum traffic from AE Services to Communication Manager is 1000 messages per second. Starting with AE Services 5.2 and Communication Manager 5.2, a processor Ethernet (procr) interface can be used for AE Services to Communication Manager AECs. This link is also limited to 1000 messages per second.
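A rough sizing sketch based on the figures above (a minimum of 5 ASAI messages per answered call, 200 messages per second per AEP toward Communication Manager, and Avaya's recommended minimum of two AEP connections). The function name is illustrative, and it sizes against the tighter AE Services to Communication Manager direction:

```python
import math

AEP_TO_CM_MSGS_PER_SEC = 200   # AE Services -> Communication Manager, per AEP
CM_TO_AEP_MSGS_PER_SEC = 240   # Communication Manager -> AE Services, per AEP
MIN_ASAI_MSGS_PER_CALL = 5

def aep_connections_needed(calls_per_second,
                           msgs_per_call=MIN_ASAI_MSGS_PER_CALL):
    """Rough sizing: how many AEP connections the ASAI event traffic
    needs, using the per-link limits quoted above, never fewer than
    the recommended minimum of two."""
    msgs_per_second = calls_per_second * msgs_per_call
    return max(2, math.ceil(msgs_per_second / AEP_TO_CM_MSGS_PER_SEC))
```

For example, 100 answered calls per second at the 5-message minimum is 500 messages per second, needing three AEP connections; features and other activity increase the per-call message count, so real deployments should size with headroom and stay under the 1000 messages per second overall limits quoted above.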

The purchase of an AE Services License 'gives' you two AEP licenses. Based on how much recording you want to do at a specific customer site, you need to calculate how many additional Station, IP Station, DMCC and TSAPI Basic User licenses will be needed to support that location.

It is recommended that the DMCC stations use their own C-LAN for registrations (separate from the C-LANs used for AEP connections and separate from the C-LANs used for regular IP Station registration). There are obvious reasons for this additional hardware on a large configuration, but for cost reasons some customers forgo those recommendations on smaller installs.

It is also recommended that the C-LAN provisioned for a switch connection not be the same C-LAN that is provisioned for the H.323 Gatekeeper (that is, the one used for H.323 device registrations by AE Services). This is again for performance reasons, coupled with the AEP traffic limitations imposed by the C-LAN.

If the application utilizes TSAPI Advanced User licenses (and there is no reason that a call recording application would necessarily need this functionality), AE Services additionally licenses the type of Communication Manager with which the Advanced TSAPI User functionality is used. Communication Manager types are classified into Small, Medium and Large. For additional details regarding when this license is needed and which switch types are associated with which classification, see the appropriate version of Avaya MultiVantage® Application Enablement Services Overview available from the Devconnect web portal.

The Avaya Aura® Communication Manager System Capacities Table, available from the Avaya Support web portal, gives the recommended number of stations per C-LAN (max 400) and the ASAI message traffic limitations for an AEP connection (given above).

Summary

License Type            SO          SSC                                          Multiple Registrations
VALUE_STA               Required    Required                                     Not Required
VALUE_IP_STA            Required    Required                                     Required
IP_API_A or DMCC_DMC    Required    Required                                     Required
VALUE_TSAPI_USERS       Optional*   One required, with the option of a second*   Optional*
VALUE_AEC_CONNECTIONS   Required for Communication Manager release 4.x and prior (all three methods)

* depending on the service utilized


Additional information regarding call recording can be found in Developing Client-side Call Recording Applications using Application Enablement Services (PDF).

Example
A recorder with Communication Manager release 5.2 that requires ANI/DNIS information will use the following licenses per monitored station.

License Type SO SSC Multiple Registrations
VALUE_STA 1 1 0
VALUE_IP_STA 1 1 1
IP_API_A or DMCC_DMC 1 1 1
VALUE_TSAPI_USERS 1 2 1

What extra resources are required for a Custom Media Stream solution?

The Custom Media Stream feature was added in Avaya Aura 8.0.1. It allows an application to record or monitor the audio from individual legs of a call separately, or any combination of call legs (parties), as a unique media stream. You can get more information on this feature in any of the DMCC Programmer's Guides.

The term recorder in this FAQ is meant to cover any application that is receiving a media stream be it for recording, analytics, or other purposes.

Licenses:

The licensing requirements for normal, Full Stream, recorders are described in the FAQ "What licenses are required for DMCC based Call Recording solution?".

While a Full Stream recorder uses one registered DMCC terminal, a Custom Media Stream recorder requires more than one DMCC terminal to be registered.  This has an impact on the number of licenses required.

Each registered DMCC multiple registrations terminal consumes one of each of the following licenses:

  • VALUE_IP_STA
  • DMCC_DMC (or IP_API_A if no DMCC_DMC is available)

Example
A Custom Media Stream recorder which receives two streams and requires ANI/DNIS information will use the following licenses per monitored station.

License Type          | SO | SSC | Multiple Registrations
VALUE_STA             | 1  | 1   | 0
VALUE_IP_STA          | 2  | 2   | 2
DMCC_DMC or IP_API_A  | 2  | 2   | 2
VALUE_AES_TSAPI_USERS | 1  | 2   | 1
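The per-station counts in the table above follow a simple rule: for a Multiple Registrations recorder, VALUE_IP_STA and DMCC_DMC each scale with the number of Custom Media Streams, while one TSAPI license covers the ANI/DNIS monitor. This can be sketched as a small tally; the class and method below are hypothetical helpers for illustration, not part of any Avaya SDK:

```java
// Illustrative only: tallies per-monitored-station license counts for a
// Multiple Registrations recorder receiving N Custom Media Streams, per the
// table above. License names come from the table; the class is hypothetical.
public class CsmLicenseTally {

    /** Returns {VALUE_STA, VALUE_IP_STA, DMCC_DMC, VALUE_AES_TSAPI_USERS}. */
    static int[] perStation(int streams) {
        int valueSta   = 0;        // VALUE_STA: not required for Multiple Registrations
        int valueIpSta = streams;  // VALUE_IP_STA: one per registered DMCC terminal
        int dmccDmc    = streams;  // DMCC_DMC (or IP_API_A): one per registered terminal
        int tsapiUsers = 1;        // one TSAPI license for the ANI/DNIS monitor
        return new int[] { valueSta, valueIpSta, dmccDmc, tsapiUsers };
    }

    public static void main(String[] args) {
        int[] t = perStation(2);   // the two-stream example from the table
        System.out.printf("VALUE_STA=%d VALUE_IP_STA=%d DMCC_DMC=%d TSAPI_USERS=%d%n",
                t[0], t[1], t[2], t[3]);
    }
}
```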


Media Resources:

As well as using extra licenses, the Custom Media Stream feature will impact the DSP requirements of the Media Server or Media Gateway.

Each Custom Media Stream will consume a logical DSP resource.   Each VoIP party in the call will also consume a DSP resource.  So, a call with two VoIP parties and two recorders will consume a total of four DSP resources.

The Custom Media Stream feature can be applied to calls containing up to Communication Manager's maximum party count (six, as of release 8.1). If the recorder requires it, six media streams, each containing one talker's audio from the call, can be established and configured. To support high availability (redundancy), a total of twelve recorders can be added through multiple registrations on a single extension. Each recorder requires the licensing and DSP resources outlined above.

Note:  Many two-party VoIP calls use direct IP media between the phones and, therefore, do not consume any Communication Manager DSP resources.  When a recorder is added to a direct IP media call, the call becomes anchored on the Media Server/Gateway and each VoIP party must consume a DSP resource.  Therefore, adding recorders to a direct IP media call will cause the number of DSP resources consumed to increase by two more than the number of recorders added.
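The DSP arithmetic above can be sketched as a small helper. The class and method names are hypothetical, and the real resource accounting is internal to Communication Manager and the Media Server; this is just the counting rule stated in the text:

```java
// Illustrative only: estimates logical DSP resources consumed when Custom
// Media Stream recorders are added to a call, per the rules described above.
public class DspEstimate {

    /**
     * @param voipParties    number of VoIP parties in the call
     * @param recorders      number of Custom Media Streams (recorders) added
     * @param wasDirectMedia true if the call used direct IP media before recording
     */
    static int dspResources(int voipParties, int recorders, boolean wasDirectMedia) {
        if (recorders == 0 && wasDirectMedia) {
            return 0; // direct IP media bypasses the Media Server/Gateway entirely
        }
        // Once anchored, each VoIP party and each stream consumes one DSP resource.
        return voipParties + recorders;
    }

    public static void main(String[] args) {
        // Two VoIP parties + two recorders = four DSP resources (example above).
        System.out.println(dspResources(2, 2, false));
        // Adding one recorder to a two-party direct-media call costs
        // recorders + 2 extra resources, as the note above explains.
        System.out.println(dspResources(2, 1, true) - dspResources(2, 0, true));
    }
}
```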

Why am I getting poor audio quality playback when using server media mode?

There are numerous contributors to poor audio quality. One important setting in this context, though, is the frames-per-packet setting on the Communication Manager ip-codec-set form. Make sure this setting is '2' (20 ms of audio per packet) for the ip-codec in use by the ip-network-region that the DMCC devices register into. Higher settings with AE Services server media mode cause slow, choppy audio playback. It may be necessary to segment the DMCC devices into their own ip-network-region if other devices' needs conflict with this frames-per-packet setting.
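Assuming G.711 (8000 one-byte samples per second) with 10 ms of audio per frame, the frames-per-packet arithmetic works out as follows; the class is illustrative only:

```java
// Quick arithmetic behind the frames-per-packet setting, assuming G.711
// (8000 samples/s, 1 byte per sample) and 10 ms of audio per frame.
public class FramesPerPacket {

    static int msPerPacket(int framesPerPacket) {
        return framesPerPacket * 10;              // 10 ms of audio per frame
    }

    static int g711PayloadBytes(int framesPerPacket) {
        return msPerPacket(framesPerPacket) * 8;  // 8 bytes of G.711 per ms
    }

    public static void main(String[] args) {
        // The recommended setting of 2 yields 20 ms / 160-byte RTP payloads.
        System.out.println(msPerPacket(2) + " ms, " + g711PayloadBytes(2) + " bytes");
        // A setting of 6 would yield 60 ms packets, which server media mode
        // handles poorly (slow, choppy playback).
        System.out.println(msPerPacket(6) + " ms, " + g711PayloadBytes(6) + " bytes");
    }
}
```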

How can near real-time split/skill status reports be generated from AE Services similar to CMS reports?

  • The DMCC SDK in conjunction with vustats-display-format feature of Avaya Communication Manager can be used to get the basic real-time status information for a split/skill.
    Note: The information received from vustats-display is not as comprehensive as what is received from CMS real-time Data Interface. In some cases the data can be accumulated and evaluated over time to produce similar statistics.
  • Configure a vustats-display-format (at the SAT, use the 'change vustats-display-format <1>' command) for the skill object and the data types (agents-staffed, calls-waiting, etc.) you want to display in near real-time.
    Note: the Format Description field has limited length. If you want to display a number of data-types, you may need to configure multiple vustats-display-formats each with a sub-set of intended data-types. Display formats can be 'chained' together using the "Next Format Number" field of the vustats-display-format so that each button push displays the next format in order.
  • Create a station ('add station next' command at Communication Manager's SAT) in Communication Manager, enable IP Softphone, provide a security code and assign a button vu-display feature with required format and Skill ID.
  • Make sure that the provisioned extension(s) are monitored in AE Services and the CTI User has access to the Security Database device group containing the extension (or the CTI user has unrestricted access).
  • Get a DeviceID for the extension.
  • Create Phone Monitor for the device from the application in order to receive display updates.
  • Register the corresponding extension using DMCC SDK. An example for this is included in the sample application available with SDK (Tutorial.java). A main mode, no-media registration is necessary.
  • Use the getButtonInfo(0) to discover the buttons provisioned on the extension and locate the vu-stats button.
  • Once the station is registered, use ButtonPress Class (ch.ecma.csta.binding.ButtonPress) and its pressButton() method to press the button corresponding to vu-stats feature.
  • Once the button is pressed, use GetDisplay Class (ch.ecma.csta.binding.GetDisplay) and its getDisplay() method to get a snapshot of display information on the device, or utilize the display events.
  • Use the getDisplayList() method of GetDisplayResponse class to iterate through the display and extract the current vu-stats information.
  • Put the program in loop after appropriate interval to repeat these operations to get the real-time stats.
  • If you want to display more data-types and need to use multiple vustats-display-formats, you can either assign different formats to different buttons of a station and press the buttons sequentially, each followed by a display snapshot; or chain the formats using the "Next Format Number" field; or assign different skills and formats to different stations, register them all with DMCC, and use multiple deviceIDs to get the information for each station.
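As a sketch of the final extraction step: once the display text is retrieved, it must be parsed according to whatever vustats-display-format was configured. The "CW=n AS=n" layout used below is purely a hypothetical example of such a format, and the class is not part of the DMCC SDK:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative only: parses a vu-stats display snapshot into label/value
// pairs. The actual layout depends on the configured vustats-display-format;
// "CW=3 AS=12" here is a hypothetical example.
public class VuStatsParser {

    private static final Pattern FIELD = Pattern.compile("([A-Z]+)=(\\d+)");

    /** Extracts labeled values such as calls-waiting (CW) or agents-staffed (AS). */
    static Map<String, Integer> parse(String display) {
        Map<String, Integer> stats = new LinkedHashMap<>();
        Matcher m = FIELD.matcher(display);
        while (m.find()) {
            stats.put(m.group(1), Integer.parseInt(m.group(2)));
        }
        return stats;
    }

    public static void main(String[] args) {
        // A snapshot string as it might arrive from the vu-display button.
        System.out.println(parse("SKILL 5 CW=3 AS=12 OLD=45"));
    }
}
```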

Can Communication Manager handle RTP media being sent to/received from different ports?

For security reasons, most Communication Manager gateways' default configurations will prevent RTP media originating from one port being sent to a different port for a single connection. If a device does so, the result could be one-way audio. Thus, best practice is for DMCC client media applications to use the same RTP port to transmit and receive RTP to/from Communication Manager gateways and endpoints.
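A minimal sketch of this symmetric-RTP practice in Java: binding a single DatagramSocket and using it for both sending and receiving guarantees that the source port of outbound RTP matches the port the gateway replies to. The gateway address and port below are placeholders:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Illustrative only: one socket, one port, for both RTP directions.
// The gateway address/port are placeholders, not real configuration.
public class SymmetricRtpSocket {

    public static void main(String[] args) throws Exception {
        try (DatagramSocket rtp = new DatagramSocket()) {      // binds one local port
            byte[] payload = new byte[160];                    // 20 ms of G.711
            InetAddress gw = InetAddress.getLoopbackAddress(); // placeholder gateway
            rtp.send(new DatagramPacket(payload, payload.length, gw, 40000));
            // Receiving on the SAME socket keeps transmit and receive on one port:
            // rtp.receive(new DatagramPacket(new byte[1500], 1500));
            System.out.println("local RTP port: " + rtp.getLocalPort());
        }
    }
}
```

Using two sockets (one to send, one to receive) is what produces the asymmetric ports that the gateway's default configuration rejects.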

How can an application add a recorder into a call without the call participants being aware of it?

Many applications want to provide a call recording service without letting the parties in the call know that recording is being done. With Single Step Conference, the application can request 'silent mode' through the CTI interface. This adds the recording party to the call but does not change the display information at the other parties in the call.

In general, a Service Observer is added silently (no display updates), with the exception of Service Observing tones.

A multiple registrations per device recorder is always added silently. It is the least detectable of all the recording strategies provided through AE Services.

Silent mode and no-talk techniques have the advantage of reducing timeslot consumption for the call, allowing more call recording capacity out of a gateway or chassis.

Why can't I use DMCC to Single Step Conference (SSC) a party to a call for recording from an Attendant without getting a silent RTP stream?

To record calls where an Attendant is a party on the call use the Service Observing method for call recording. This has been tested and a normal RTP stream is received.

Here is an explanation on why SSC will not work.

First a simple call flow to explain the scenario:

32105: The originating party from an on-pbx H323 phone

32111: one-X Attendant in telecommuter mode with 32106 as the telecommuter "telephone at" destination.

32106: H323 phone as "Telephone At" extension for one-X Attendant extension 32111

32107: Registered DMCC station (DMCC dashboard) in main mode with client media control

  1. Place TSAPI monitor on 32106
  2. 32105 calls 32111
  3. Phone call is alerting on 32111 and 32106
  4. Answer the calls alerting on 32111 and 32106
  5. Get the call ID on 32106 from the DELIVERED (or ESTABLISHED) event from the TSAPI device monitor on x32106
  6. Send a SSC request with call ID from step 5, and extension 32106 as the controlling extension to add device 32107 (the DMCC call recording device) to the call.
  7. The SSC is successful
  8. Check the media stream being sent to the DMCC extension 32107 and it is all hex f's, which indicates silence.

What is going on 'behind the scenes' is actually two separate calls in Communication Manager. One call is between the calling party and the attendant. The second call is hidden from the users but represents the audio path between the attendant and the telecommuter extension; in Communication Manager this class of connection is referred to as a "Service Link". Both the telecommuter extension AND the attendant extension have to be answered for the telecommuter extension to have a media path to the calling party. This service link carries the audio path from the first call to the telecommuter extension. While it is convenient to refer to this connection as a 'call', bear in mind that it is a special form of connection to Communication Manager, and the uniqueness of this connection type is why some services work with it, some do not, and some work only partially.

In order to demonstrate that there are two different calls, run two tests. In the first test, run the above scenario and do a snapshotDevice on the originating station (x32105) and on the telecommuter extension (x32106). Note that there are two different callIDs. The callID given by the snapshot of the originating station (x32105) is the callID that would be received if monitoring of attendants were allowed in TSAPI; this is the callID to use for the recorder to get the proper RTP stream. The service link callID (from monitoring the telecommuter extension) will give you problematic RTP. The state of this 'call' is such that it is only ever expected to have one party in it, along with a logical relationship to the caller/attendant call, and doing SSC into this call violates the assumptions the code makes.

In the second test, replace one-X Attendant with Avaya IP Softphone. A TSAPI monitor can be placed on IP Softphone to get the callID for the call as it exists at the destination party (as opposed to the originator; if the call is contained on a single Communication Manager these two callIDs will be the same). Here is the scenario, replacing the attendant with IP Softphone:

32105: The originating party from an on-pbx H323 phone

32108: Avaya IP Softphone in telecommuter mode with 32106 as the telecommuter "telephone at" destination.

32106: H323 phone as "Telephone At" extension for 32108

32107: Registered DMCC station (DMCC dashboard) in main mode with client media control:

  1. Place TSAPI monitor on 32108
  2. Place TSAPI monitor on 32106
  3. 32105 calls 32108
  4. Get the call ID on 32108 from the DELIVERED (or ESTABLISHED) event in TSAPI
  5. Phone call is alerting on 32106
  6. Answer call on 32106
  7. Get the call ID on 32106 from the DELIVERED/ESTABLISHED event in TSAPI
  8. Send a SSC request with the call ID from step 4, and extension 32106 as the controlling extension, to add device 32107 (the DMCC call recording device) to the call.
  9. The SSC is successful
  10. Normal RTP media stream is received at extension 32107

The Avaya IP Softphone acts as a normal H323 telephone and can be monitored by TSAPI (step 1). A monitor on the extension associated with IP Softphone therefore yields a call ID (step 4), while a monitor on the 'Telephone At' extension shows a different call ID for the service link connection (step 7). Using the call ID from the Avaya IP Softphone extension (step 4) to complete the SSC produces an RTP stream that is not constant hex f's, and therefore not silent. However, performing SSC with the service link callID reproduces the problem described above.

Since TSAPI cannot be used to monitor an attendant (or IP Attendant) extension, the application will be unable to get the callID for the 'right' call. Since services like SSC, Service Observing and Multiple Registrations per Device are not supported on 'Service Links' the behavior you are seeing is due to unsupported feature interactions.

In summary: because TSAPI does not support monitoring Attendants, SSC cannot be used to record these calls, as the application cannot obtain the correct call ID with which to add the recorder to the call.

Are there any limitations when using DMCC Multiple Registrations to collect call media?

Yes. There is an interaction between Selective Listen Hold (also referred to as Selective Listen Disconnect) and DMCC Multiple Registrations whereby a Multiple Registrations (MR) device may be impacted. Call media is defined as a generic reference to any one of, or a collection of, voice (RTP) traffic, in-band tones, DTMF via RTP payload-type indications, and out-of-band DTMF events for calls. An application may utilize one or more of these types of call media. The use of the Selective Listening Hold service by any deployed application to modify the call media to a specific extension will impact the call media received by any/all other applications using the Multiple Registrations feature on the same extension.

Application developers who are working with call media should review the DevConnect White Paper Recommended Guidance for DMCC Applications Utilizing Call Media for further information.

Can the use of Selective Listen Hold adversely impact other applications?

Yes. Selective Listen Hold (SLH), also referred to as Selective Listen Disconnect, prevents the voice and DTMF signals coming from a specified party from being received by another party in the call. In many cases blocking that pathway is valuable functionality; one use is in a PCI-DSS application, to prevent a contact center agent from hearing personal information supplied by a party in the call.

However, if a recording device using DMCC Multiple Registrations is receiving the agent's audio stream, then when SLH is invoked to reconfigure the audio the agent receives, the recording application is also blocked from receiving the voice and DTMF information sent by the far-end party. This may or may not be what the call recording application expects. Similarly, if the Multiple Registrations application is performing analytics on the audio stream, invoking SLH may prevent it from receiving audio or DTMF events unexpectedly and undesirably.

Since multiple applications may be deployed at a customer site (one using Multiple Registrations and another invoking SLH), both operating on the same party in the call, the MR application may be unaware that the interaction is occurring. If the application requires full control over the receipt of all audio and DTMF information in a call, review the DevConnect White Paper Recommended Guidance for DMCC Applications Utilizing Call Media for further information on how to design for this interaction.