expand ansible-doc coverage (#74963)
* Expand ansible-doc to tests/filters and fix existing issues:
  - enable filter/test docs if in a single file or companion YAML
  - add docs for several filter/test plugins
  - allow .yml companion docs for other plugins; must be colocated
  - verify plugins are valid (not modules, cannot)
  - fix 'per collection' filtering
  - limit old-style deprecation (_ prefix) to builtin/legacy
  - start move to pathlib for saner path handling
  - moved some functions, kept backwards-compat shims with deprecation notice

Co-authored-by: Abhijeet Kasurde <akasurde@redhat.com>
Co-authored-by: Felix Fontein <felix@fontein.de>
Co-authored-by: Sandra McCann <samccann@redhat.com>
This commit is contained in:
parent a65fbfad5b
commit b439e41a91
@@ -24,6 +24,9 @@ recursive-include lib/ansible/module_utils/powershell *.psm1
recursive-include lib/ansible/modules/windows *.ps1
recursive-include lib/ansible/galaxy/data *.yml *.j2 README.md ansible.cfg inventory .git_keep
recursive-include lib/ansible/config *.yml
recursive-include lib/ansible/modules *.yml
recursive-include lib/ansible/plugins/test *.yml
recursive-include lib/ansible/plugins/filter *.yml
recursive-include licenses *.txt
recursive-include packaging *
recursive-include test/ansible_test *.py Makefile
@@ -21,7 +21,29 @@ You can add a custom filter plugin by dropping it into a ``filter_plugins`` dire
Using filter plugins
--------------------

For information on using filter plugins, see :ref:`playbooks_filters`.
You can use filters anywhere you can use templating in Ansible: in a play, in a variables file, or in a Jinja2 template for the :ref:`template <template_module>` module. For more information on using filter plugins, see :ref:`playbooks_filters`. Filters can return any type of data, but if you want to always return a boolean (``True`` or ``False``) you should be looking at a test instead.

.. code-block:: YAML+Jinja

    vars:
      yaml_string: "{{ some_variable|to_yaml }}"

Filters are the preferred way to manipulate data in Ansible. You can identify a filter because it is normally preceded by a ``|``, with the expression on the left of it being the first input of the filter. Additional parameters may be passed into the filter itself as you would with most programming functions. These parameters can be either ``positional`` (passed in order) or ``named`` (passed as key=value pairs). When passing both types, positional arguments should go first.

.. code-block:: YAML+Jinja

    passing_positional: {{ (x == 32) | ternary('x is 32', 'x is not 32') }}
    passing_extra_named_parameters: {{ some_variable | to_yaml(indent=8, width=1337) }}
    passing_both: {{ some_variable | ternary('true value', 'false value', none_val='NULL') }}

In the documentation, filters will always have an ``_input`` option that corresponds to the expression to the left of ``|``. A ``positional:`` field in the documentation will show which options are positional and in which order they are required.


Plugin list
-----------

You can use ``ansible-doc -t filter -l`` to see the list of available plugins. Use ``ansible-doc -t filter <plugin name>`` to see specific documents and examples.


.. seealso::
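The positional/named parameter rules above can be mirrored in plain Python to show how the piped expression maps to the filter's first input. This is an illustrative sketch of ``ternary``'s documented behavior, not the implementation that ships in ``ansible.plugins.filter.core``:

```python
def ternary(value, true_val, false_val, none_val=None):
    # the expression to the left of '|' arrives as the first input (_input)
    # remaining filter arguments may be positional or named, as in the docs above
    if value is None and none_val is not None:
        return none_val
    return true_val if value else false_val

# equivalent of: {{ (x == 32) | ternary('x is 32', 'x is not 32') }} with x = 32
result = ternary(32 == 32, 'x is 32', 'x is not 32')
```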
@@ -9,7 +9,6 @@ Test plugins

Test plugins evaluate template expressions and return True or False. With test plugins you can create :ref:`conditionals <playbooks_conditionals>` to implement the logic of your tasks, blocks, plays, playbooks, and roles. Ansible uses the `standard tests`_ shipped as part of Jinja, and adds some specialized test plugins. You can :ref:`create custom Ansible test plugins <developing_test_plugins>`.

.. _standard tests: https://jinja.palletsprojects.com/en/latest/templates/#builtin-tests

.. _enabling_test:

@@ -24,7 +23,63 @@ You can add a custom test plugin by dropping it into a ``test_plugins`` director
Using test plugins
------------------

The User Guide offers detailed documentation on :ref:`using test plugins <playbooks_tests>`.
You can use tests anywhere you can use templating in Ansible: in a play, in a variables file, or in a Jinja2 template for the :ref:`template <template_module>` module. For more information on using test plugins, see :ref:`playbooks_tests`.

Tests always return ``True`` or ``False``; they are always a boolean. If you need a different return type, you should be looking at filters.

You can recognize test plugins by the use of the ``is`` statement in a template; they can also be used as part of the ``select`` family of filters.

.. code-block:: YAML+Jinja

    vars:
      is_ready: '{{ task_result is success }}'

    tasks:
      - name: conditionals are always in 'template' context
        action: dostuff
        when: task_result is failed

Tests will always have an ``_input``, and this is normally what is on the left side of ``is``. Tests can also take additional parameters as you would with most programming functions. These parameters can be either ``positional`` (passed in order) or ``named`` (passed as key=value pairs). When passing both types, positional arguments should go first.

.. code-block:: YAML+Jinja

    tasks:
      - name: pass positional parameter to match test
        action: dostuff
        when: myurl is match("https://example.com/users/.*/resources")

      - name: pass named parameter to truthy test
        action: dostuff
        when: myvariable is truthy(convert_bool=True)

      - name: pass both types to 'version' test
        action: dostuff
        when: sample_semver_var is version('2.0.0-rc.1+build.123', 'lt', version_type='semver')


Using test plugins with lists
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

As mentioned above, one way to use tests is with the ``select`` family of filters (``select``, ``reject``, ``selectattr``, ``rejectattr``).

.. code-block:: YAML+Jinja

    # give me only defined variables from a list of variables, using 'defined' test
    good_vars: "{{ all_vars|select('defined') }}"

    # this uses the 'equalto' test to filter out non 'fixed' type of addresses from a list
    only_fixed_addresses: "{{ all_addresses|selectattr('type', 'equalto', 'fixed') }}"

    # this does the opposite of the previous one
    only_fixed_addresses: "{{ all_addresses|rejectattr('type', 'equalto', 'fixed') }}"


Plugin list
-----------

You can use ``ansible-doc -t test -l`` to see the list of available plugins. Use ``ansible-doc -t test <plugin name>`` to see specific documents and examples.


.. seealso::
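The ``select`` family described above can be sketched in plain Python: a test is just a boolean-returning function applied to each list element. This is an illustrative analog, not Jinja's actual implementation:

```python
def select(items, test):
    # rough analog of Jinja's select filter:
    # keep only the items for which the test plugin returns True
    return [item for item in items if test(item)]

def defined(value):
    # stand-in for the 'defined' test: undefined variables are modeled as None here
    return value is not None

# analog of: good_vars: "{{ all_vars|select('defined') }}"
good_vars = select(['a', None, 'b'], defined)
```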
@@ -36,8 +91,8 @@ The User Guide offers detailed documentation on :ref:`using test plugins <playbo
       Using conditional statements
   :ref:`filter_plugins`
       Filter plugins
   :ref:`playbooks_filters`
       Using filters
   :ref:`playbooks_tests`
       Using tests
   :ref:`lookup_plugins`
       Lookup plugins
   `User Mailing List <https://groups.google.com/group/ansible-devel>`_
@@ -18,12 +18,11 @@ import re
import textwrap
import traceback

from collections.abc import Sequence

import yaml

import ansible.plugins.loader as plugin_loader

from collections.abc import Sequence
from pathlib import Path

from ansible import constants as C
from ansible import context
from ansible.cli.arguments import option_helpers as opt_help
@@ -37,12 +36,12 @@ from ansible.module_utils.six import string_types
from ansible.parsing.plugin_docs import read_docstub
from ansible.parsing.utils.yaml import from_yaml
from ansible.parsing.yaml.dumper import AnsibleDumper
from ansible.plugins.list import list_plugins
from ansible.plugins.loader import action_loader, fragment_loader
from ansible.utils.collection_loader import AnsibleCollectionConfig, AnsibleCollectionRef
from ansible.utils.collection_loader._collection_finder import _get_collection_name_from_path
from ansible.utils.display import Display
from ansible.utils.plugin_docs import (
    REJECTLIST,
    get_docstring,
    get_versioned_doclink,
)
@@ -56,6 +55,11 @@ PB_LOADED = {}
SNIPPETS = ['inventory', 'lookup', 'module']


def add_collection_plugins(plugin_list, plugin_type, coll_filter=None):
    display.deprecated("add_collection_plugins method, use ansible.plugins.list functions instead.", version='2.17')
    plugin_list.update(list_plugins(plugin_type, coll_filter))


def jdump(text):
    try:
        display.display(json.dumps(text, cls=AnsibleJSONEncoder, sort_keys=True, indent=4))
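The backwards-compat shim above (warn, then delegate to the new API) is a common pattern when moving functions between modules. A minimal self-contained sketch, using the stdlib `warnings` module in place of Ansible's `display.deprecated` and a hypothetical stand-in for `ansible.plugins.list.list_plugins`:

```python
import warnings

def list_plugins(plugin_type, coll_filter=None):
    # hypothetical stand-in for ansible.plugins.list.list_plugins:
    # returns a mapping of fully qualified plugin names to resolved paths
    return {'ansible.builtin.demo_%s' % plugin_type: '/fake/path/demo.py'}

def add_collection_plugins(plugin_list, plugin_type, coll_filter=None):
    # the shim's only job: emit a deprecation warning, then delegate to the new API
    warnings.warn("add_collection_plugins is deprecated, use ansible.plugins.list "
                  "functions instead", DeprecationWarning)
    plugin_list.update(list_plugins(plugin_type, coll_filter))

plugins = set()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    add_collection_plugins(plugins, 'filter')  # old callers keep working
```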
@@ -64,17 +68,6 @@ def jdump(text):
    raise AnsibleError('We could not convert all the documentation into JSON as there was a conversion issue: %s' % to_native(e))


def add_collection_plugins(plugin_list, plugin_type, coll_filter=None):

    # TODO: take into account runtime.yml once implemented
    b_colldirs = list_collection_dirs(coll_filter=coll_filter)
    for b_path in b_colldirs:
        path = to_text(b_path, errors='surrogate_or_strict')
        collname = _get_collection_name_from_path(b_path)
        ptype = C.COLLECTION_PTYPE_COMPAT.get(plugin_type, plugin_type)
        plugin_list.update(DocCLI.find_plugins(os.path.join(path, 'plugins', ptype), False, plugin_type, collection=collname))


class PluginNotFound(Exception):
    pass
@@ -394,6 +387,11 @@ class DocCLI(CLI, RoleMixin):
        super(DocCLI, self).__init__(args)
        self.plugin_list = set()

    @classmethod
    def find_plugins(cls, path, internal, plugin_type, coll_filter=None):
        display.deprecated("find_plugins method as it is incomplete/incorrect. use ansible.plugins.list functions instead.", version='2.17')
        return list_plugins(plugin_type, coll_filter, [path]).keys()

    @classmethod
    def tty_ify(cls, text):
@@ -477,33 +475,44 @@ class DocCLI(CLI, RoleMixin):
    def display_plugin_list(self, results):

        # format for user
        displace = max(len(x) for x in self.plugin_list)
        displace = max(len(x) for x in results.keys())
        linelimit = display.columns - displace - 5
        text = []
        deprecated = []

        # format display per option
        if context.CLIARGS['list_files']:
            # list plugin file names
            for plugin in results.keys():
                filename = results[plugin]
                text.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(filename), filename))
            for plugin in sorted(results.keys()):
                filename = to_native(results[plugin])

                # handle deprecated for builtin/legacy
                pbreak = plugin.split('.')
                if pbreak[-1].startswith('_') and pbreak[0] == 'ansible' and pbreak[1] in ('builtin', 'legacy'):
                    pbreak[-1] = pbreak[-1][1:]
                    plugin = '.'.join(pbreak)
                    deprecated.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(filename), filename))
                else:
                    text.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(filename), filename))
        else:
            # list plugin names and short desc
            deprecated = []
            for plugin in results.keys():
            for plugin in sorted(results.keys()):
                desc = DocCLI.tty_ify(results[plugin])

                if len(desc) > linelimit:
                    desc = desc[:linelimit] + '...'

                if plugin.startswith('_'):  # Handle deprecated  # TODO: add mark for deprecated collection plugins
                    deprecated.append("%-*s %-*.*s" % (displace, plugin[1:], linelimit, len(desc), desc))
                pbreak = plugin.split('.')
                if pbreak[-1].startswith('_'):  # Handle deprecated  # TODO: add mark for deprecated collection plugins
                    pbreak[-1] = pbreak[-1][1:]
                    plugin = '.'.join(pbreak)
                    deprecated.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(desc), desc))
                else:
                    text.append("%-*s %-*.*s" % (displace, plugin, linelimit, len(desc), desc))

            if len(deprecated) > 0:
                text.append("\nDEPRECATED:")
                text.extend(deprecated)
        if len(deprecated) > 0:
            text.append("\nDEPRECATED:")
            text.extend(deprecated)

        # display results
        DocCLI.pager("\n".join(text))
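The "limit old style deprecation (_ prefix) to builtin/legacy" behavior in the hunk above can be isolated as a small pure function. A sketch under the same rule the diff applies, with hypothetical names:

```python
def strip_deprecation_prefix(plugin):
    """Strip the old-style '_' deprecation prefix from a plugin's short name,
    but only for ansible.builtin / ansible.legacy plugins; collection plugins
    keep their names untouched. Returns (name, is_deprecated)."""
    parts = plugin.split('.')
    if parts[-1].startswith('_') and parts[0] == 'ansible' and parts[1] in ('builtin', 'legacy'):
        return '.'.join(parts[:-1] + [parts[-1][1:]]), True
    return plugin, False
```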
@@ -635,27 +644,24 @@ class DocCLI(CLI, RoleMixin):
    def _list_plugins(self, plugin_type, content):

        results = {}
        self.plugins = {}
        loader = DocCLI._prep_loader(plugin_type)

        coll_filter = self._get_collection_filter()
        if coll_filter in ('ansible.builtin', 'ansible.legacy', '', None):
            paths = loader._get_paths_with_context()
            for path_context in paths:
                self.plugin_list.update(DocCLI.find_plugins(path_context.path, path_context.internal, plugin_type))

        add_collection_plugins(self.plugin_list, plugin_type, coll_filter=coll_filter)
        self.plugins.update(list_plugins(plugin_type, coll_filter, context.CLIARGS['module_path']))

        # get appropriate content depending on option
        if content == 'dir':
            results = self._get_plugin_list_descriptions(loader)
        elif content == 'files':
            results = self._get_plugin_list_filenames(loader)
            results = {k: self.plugins[k][0] for k in self.plugins.keys()}
        else:
            results = {k: {} for k in self.plugin_list}
            results = {k: {} for k in self.plugins.keys()}
            self.plugin_list = set()  # reset for next iteration

        return results

    def _get_plugins_docs(self, plugin_type, names, fail_on_errors=True):
    def _get_plugins_docs(self, plugin_type, names, fail_ok=False, fail_on_errors=True):

        loader = DocCLI._prep_loader(plugin_type)
        search_paths = DocCLI.print_paths(loader)
@@ -663,6 +669,7 @@ class DocCLI(CLI, RoleMixin):
        # get the docs for plugins in the command line list
        plugin_docs = {}
        for plugin in names:
            doc = {}
            try:
                doc, plainexamples, returndocs, metadata = DocCLI._get_plugin_doc(plugin, plugin_type, loader, search_paths)
            except PluginNotFound:
@@ -675,9 +682,11 @@ class DocCLI(CLI, RoleMixin):
                }
                continue
            display.vvv(traceback.format_exc())
            raise AnsibleError("%s %s missing documentation (or could not parse"
                               " documentation): %s\n" %
                               (plugin_type, plugin, to_native(e)))
            msg = "%s %s missing documentation (or could not parse documentation): %s\n" % (plugin_type, plugin, to_native(e))
            if fail_ok:
                display.warning(msg)
            else:
                raise AnsibleError(msg)

        if not doc:
            # The doc section existed but was empty
@@ -776,8 +785,9 @@ class DocCLI(CLI, RoleMixin):
                docs['all'][ptype] = DocCLI._get_keywords_docs(names.keys())
            else:
                plugin_names = self._list_plugins(ptype, None)
                docs['all'][ptype] = self._get_plugins_docs(
                    ptype, plugin_names, fail_on_errors=not context.CLIARGS['no_fail_on_errors'])
                # TODO: remove exception for test/filter once all core ones are documented
                docs['all'][ptype] = self._get_plugins_docs(ptype, plugin_names, fail_ok=(ptype in ('test', 'filter')),
                                                            fail_on_errors=not context.CLIARGS['no_fail_on_errors'])
                # reset list after each type to avoid pollution
        elif listing:
            if plugin_type == 'keyword':
@@ -846,6 +856,43 @@ class DocCLI(CLI, RoleMixin):

        return 0

    @staticmethod
    def get_all_plugins_of_type(plugin_type):
        loader = getattr(plugin_loader, '%s_loader' % plugin_type)
        paths = loader._get_paths_with_context()
        plugins = {}
        for path_context in paths:
            plugins.update(list_plugins(plugin_type, searc_path=context.CLIARGS['module_path']))
        return sorted(plugins.keys())

    @staticmethod
    def get_plugin_metadata(plugin_type, plugin_name):
        # if the plugin lives in a non-python file (eg, win_X.ps1), require the corresponding python file for docs
        loader = getattr(plugin_loader, '%s_loader' % plugin_type)
        result = loader.find_plugin_with_context(plugin_name, mod_type='.py', ignore_deprecated=True, check_aliases=True)
        if not result.resolved:
            raise AnsibleError("unable to load {0} plugin named {1} ".format(plugin_type, plugin_name))
        filename = result.plugin_resolved_path
        collection_name = result.plugin_resolved_collection

        try:
            doc, __, __, __ = get_docstring(filename, fragment_loader, verbose=(context.CLIARGS['verbosity'] > 0),
                                            collection_name=collection_name, plugin_type=plugin_type)
        except Exception:
            display.vvv(traceback.format_exc())
            raise AnsibleError("%s %s at %s has a documentation formatting error or is missing documentation." % (plugin_type, plugin_name, filename))

        if doc is None:
            # Removed plugins don't have any documentation
            return None

        return dict(
            name=plugin_name,
            namespace=DocCLI.namespace_from_plugin_filepath(filename, plugin_name, loader.package_path),
            description=doc.get('short_description', "UNKNOWN"),
            version_added=doc.get('version_added', "UNKNOWN")
        )

    @staticmethod
    def namespace_from_plugin_filepath(filepath, plugin_name, basedir):
        if not basedir.endswith('/'):
@@ -862,15 +909,20 @@ class DocCLI(CLI, RoleMixin):
    @staticmethod
    def _get_plugin_doc(plugin, plugin_type, loader, search_paths):
        # if the plugin lives in a non-python file (eg, win_X.ps1), require the corresponding python file for docs
        result = loader.find_plugin_with_context(plugin, mod_type='.py', ignore_deprecated=True, check_aliases=True)
        if not result.resolved:
            raise PluginNotFound('%s was not found in %s' % (plugin, search_paths))
        for ext in C.DOC_EXTENSIONS:
            result = loader.find_plugin_with_context(plugin, mod_type=ext, ignore_deprecated=True, check_aliases=True)
            if result.resolved:
                break
        else:
            if not result.resolved:
                raise PluginNotFound('%s was not found in %s' % (plugin, search_paths))

        filename = result.plugin_resolved_path
        collection_name = result.plugin_resolved_collection

        doc, plainexamples, returndocs, metadata = get_docstring(
            filename, fragment_loader, verbose=(context.CLIARGS['verbosity'] > 0),
            collection_name=collection_name, is_module=(plugin_type == 'module'))
            collection_name=collection_name, plugin_type=plugin_type)

        # If the plugin existed but did not have a DOCUMENTATION element and was not removed, it's an error
        if doc is None:
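The extension loop above relies on Python's `for`/`else`: the `else` branch runs only when the loop finishes without a `break`, i.e. no documentation extension resolved. A minimal self-contained sketch of the same control flow, with a hypothetical in-memory lookup in place of the plugin loader:

```python
# mirrors the DOC_EXTENSIONS constant the diff adds to ansible.constants
PYTHON_DOC_EXTENSIONS = ('.py', '.pyc', '.pyo')
YAML_DOC_EXTENSIONS = ('.yml', '.yaml')
DOC_EXTENSIONS = PYTHON_DOC_EXTENSIONS + YAML_DOC_EXTENSIONS

class PluginNotFound(Exception):
    pass

def resolve_plugin_doc(plugin, available):
    """Try each documentation extension in order; raise only when the loop
    never hit `break` (no candidate resolved)."""
    for ext in DOC_EXTENSIONS:
        candidate = plugin + ext
        if candidate in available:
            break  # resolved: skip the else branch
    else:
        raise PluginNotFound('%s was not found' % plugin)
    return candidate
```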
@@ -936,67 +988,33 @@ class DocCLI(CLI, RoleMixin):

        return text

    @staticmethod
    def find_plugins(path, internal, ptype, collection=None):
        # if internal, collection could be set to `ansible.builtin`

        display.vvvv("Searching %s for plugins" % path)

        plugin_list = set()

        if not os.path.exists(path):
            display.vvvv("%s does not exist" % path)
            return plugin_list

        if not os.path.isdir(path):
            display.vvvv("%s is not a directory" % path)
            return plugin_list

        bkey = ptype.upper()
        for plugin in os.listdir(path):
            display.vvvv("Found %s" % plugin)
            full_path = '/'.join([path, plugin])

            if plugin.startswith('.'):
                continue
            elif os.path.isdir(full_path):
                continue
            elif any(plugin.endswith(x) for x in C.REJECT_EXTS):
                continue
            elif plugin.startswith('__'):
                continue
            elif plugin in C.IGNORE_FILES:
                continue
            elif plugin.startswith('_'):
                if os.path.islink(full_path):  # avoids aliases
                    continue

            plugin = os.path.splitext(plugin)[0]  # removes the extension
            plugin = plugin.lstrip('_')  # remove underscore from deprecated plugins

            if plugin not in REJECTLIST.get(bkey, ()):

                if collection:
                    plugin = '%s.%s' % (collection, plugin)

                plugin_list.add(plugin)
                display.vvvv("Added %s" % plugin)

        return plugin_list

    def _get_plugin_list_descriptions(self, loader):

        descs = {}
        plugins = self._get_plugin_list_filenames(loader)
        for plugin in plugins.keys():

            filename = plugins[plugin]

        for plugin in self.plugins.keys():
            doc = None
            filename = Path(to_native(self.plugins[plugin][0]))
            docerror = None
            try:
                doc = read_docstub(filename)
            except Exception:
                display.warning("%s has a documentation formatting error" % plugin)
            except Exception as e:
                docerror = e

            # plugin file was empty or had error, lets try other options
            if doc is None:
                # handle test/filters that are in file with diff name
                base = plugin.split('.')[-1]
                basefile = filename.with_stem(base)
                for extension in ('.py', '.yml', '.yaml'):  # TODO: constant?
                    docfile = basefile.with_suffix(extension)
                    try:
                        if docfile.exists():
                            doc = read_docstub(docfile)
                    except Exception as e:
                        docerror = e

            if docerror:
                display.warning("%s has a documentation formatting error: %s" % (plugin, docerror))
                continue

            if not doc or not isinstance(doc, dict):
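The sidecar lookup above builds candidate file names with `pathlib`'s `with_stem`/`with_suffix` (part of the "start move to pathlib" noted in the commit message; `with_stem` needs Python 3.9+). A self-contained sketch of just that path arithmetic, with hypothetical names:

```python
from pathlib import Path

def candidate_doc_files(plugin_fqcn, plugin_file, extensions=('.py', '.yml', '.yaml')):
    """For a plugin whose docs may live in a differently named colocated file
    (common for filters/tests defined together in one module), derive the
    candidate doc paths from the plugin's short name."""
    short_name = plugin_fqcn.split('.')[-1]          # 'ansible.builtin.to_yaml' -> 'to_yaml'
    base = Path(plugin_file).with_stem(short_name)   # swap the file's stem, keep directory
    return [base.with_suffix(ext) for ext in extensions]

candidates = candidate_doc_files('ansible.builtin.to_yaml', '/plugins/filter/core.py')
```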
@@ -1008,29 +1026,6 @@ class DocCLI(CLI, RoleMixin):

        return descs

    def _get_plugin_list_filenames(self, loader):
        pfiles = {}
        for plugin in sorted(self.plugin_list):

            try:
                # if the module lives in a non-python file (eg, win_X.ps1), require the corresponding python file for docs
                filename = loader.find_plugin(plugin, mod_type='.py', ignore_deprecated=True, check_aliases=True)

                if filename is None:
                    continue
                if filename.endswith(".ps1"):
                    continue
                if os.path.isdir(filename):
                    continue

                pfiles[plugin] = filename

            except Exception as e:
                display.vvv(traceback.format_exc())
                raise AnsibleError("Failed reading docs at %s: %s" % (plugin, to_native(e)), orig_exc=e)

        return pfiles

    @staticmethod
    def print_paths(finder):
        ''' Returns a string suitable for printing of the search path '''
@@ -1045,7 +1040,7 @@ class DocCLI(CLI, RoleMixin):

    @staticmethod
    def _dump_yaml(struct, indent):
        return DocCLI.tty_ify('\n'.join([indent + line for line in yaml.dump(struct, default_flow_style=False, Dumper=AnsibleDumper).split('\n')]))
        return DocCLI.tty_ify('\n'.join([indent + line for line in yaml_dump(struct, default_flow_style=False, Dumper=AnsibleDumper).split('\n')]))

    @staticmethod
    def _format_version_added(version_added, version_added_collection=None):
@@ -12,11 +12,23 @@ from ansible.errors import AnsibleError
from ansible.collections import is_collection_path
from ansible.module_utils._text import to_bytes
from ansible.utils.collection_loader import AnsibleCollectionConfig
from ansible.utils.collection_loader._collection_finder import _get_collection_name_from_path
from ansible.utils.display import Display

display = Display()


def list_collections(coll_filter=None, search_paths=None, dedupe=False):

    collections = {}
    for candidate in list_collection_dirs(search_paths=search_paths, coll_filter=coll_filter):
        if os.path.exists(candidate):
            collection = _get_collection_name_from_path(candidate)
            if collection not in collections or not dedupe:
                collections[collection] = candidate
    return collections


def list_valid_collection_paths(search_paths=None, warn=False):
    """
    Filter out non existing or invalid search_paths for collections
@@ -13,7 +13,6 @@ from ansible.config.manager import ConfigManager
from ansible.module_utils._text import to_text
from ansible.module_utils.common.collections import Sequence
from ansible.module_utils.parsing.convert_bool import BOOLEANS_TRUE
from ansible.module_utils.six import string_types
from ansible.release import __version__
from ansible.utils.fqcn import add_internal_fqcns
@@ -100,6 +99,11 @@ COLOR_CODES = {
REJECT_EXTS = ('.pyc', '.pyo', '.swp', '.bak', '~', '.rpm', '.md', '.txt', '.rst')
BOOL_TRUE = BOOLEANS_TRUE
COLLECTION_PTYPE_COMPAT = {'module': 'modules'}

PYTHON_DOC_EXTENSIONS = ('.py', '.pyc', '.pyo')
YAML_DOC_EXTENSIONS = ('.yml', '.yaml')
DOC_EXTENSIONS = PYTHON_DOC_EXTENSIONS + YAML_DOC_EXTENSIONS

DEFAULT_BECOME_PASS = None
DEFAULT_PASSWORD_CHARS = to_text(ascii_letters + digits + ".,:-_", errors='strict')  # characters included in auto-generated passwords
DEFAULT_REMOTE_PASS = None
@@ -107,8 +111,8 @@ DEFAULT_SUBSET = None
# FIXME: expand to other plugins, but never doc fragments
CONFIGURABLE_PLUGINS = ('become', 'cache', 'callback', 'cliconf', 'connection', 'httpapi', 'inventory', 'lookup', 'netconf', 'shell', 'vars')
# NOTE: always update the docs/docsite/Makefile to match
DOCUMENTABLE_PLUGINS = CONFIGURABLE_PLUGINS + ('module', 'strategy')
IGNORE_FILES = ("COPYING", "CONTRIBUTING", "LICENSE", "README", "VERSION", "GUIDELINES")  # ignore during module search
DOCUMENTABLE_PLUGINS = CONFIGURABLE_PLUGINS + ('module', 'strategy', 'test', 'filter')
IGNORE_FILES = ("COPYING", "CONTRIBUTING", "LICENSE", "README", "VERSION", "GUIDELINES", "MANIFEST", "Makefile")  # ignore during module search
INTERNAL_RESULT_KEYS = ('add_host', 'add_group')
LOCALHOST = ('127.0.0.1', 'localhost', '::1')
MODULE_REQUIRE_ARGS = tuple(add_internal_fqcns(('command', 'win_command', 'ansible.windows.win_command', 'shell', 'win_shell',
@@ -116,6 +120,7 @@ MODULE_REQUIRE_ARGS = tuple(add_internal_fqcns(('command', 'win_command', 'ansib
MODULE_NO_JSON = tuple(add_internal_fqcns(('command', 'win_command', 'ansible.windows.win_command', 'shell', 'win_shell',
                                           'ansible.windows.win_shell', 'raw')))
RESTRICTED_RESULT_KEYS = ('ansible_rsync_path', 'ansible_playbook_python', 'ansible_facts')
SYNTHETIC_COLLECTIONS = ('ansible.builtin', 'ansible.legacy')
TREE_DIR = None
VAULT_VERSION_MIN = 1.0
VAULT_VERSION_MAX = 1.0
@@ -155,10 +155,8 @@ class CollectionVerifyResult:
        self.success = True  # type: bool


def verify_local_collection(
        local_collection, remote_collection,
        artifacts_manager,
):  # type: (Candidate, Candidate | None, ConcreteArtifactsManager) -> CollectionVerifyResult
def verify_local_collection(local_collection, remote_collection, artifacts_manager):
    # type: (Candidate, Candidate | None, ConcreteArtifactsManager) -> CollectionVerifyResult
    """Verify integrity of the locally installed collection.

    :param local_collection: Collection being checked.
@@ -168,9 +166,7 @@ def verify_local_collection(
    """
    result = CollectionVerifyResult(local_collection.fqcn)

    b_collection_path = to_bytes(
        local_collection.src, errors='surrogate_or_strict',
    )
    b_collection_path = to_bytes(local_collection.src, errors='surrogate_or_strict')

    display.display("Verifying '{coll!s}'.".format(coll=local_collection))
    display.display(
@@ -881,10 +877,7 @@ def verify_collections(
            )
            raise

        result = verify_local_collection(
            local_collection, remote_collection,
            artifacts_manager,
        )
        result = verify_local_collection(local_collection, remote_collection, artifacts_manager)

        results.append(result)
@@ -1245,10 +1238,7 @@ def find_existing_collections(path, artifacts_manager):
            continue

        try:
            req = Candidate.from_dir_path_as_unknown(
                b_collection_path,
                artifacts_manager,
            )
            req = Candidate.from_dir_path_as_unknown(b_collection_path, artifacts_manager)
        except ValueError as val_err:
            raise_from(AnsibleError(val_err), val_err)
@@ -1389,11 +1379,7 @@ def install_artifact(b_coll_targz_path, b_collection_path, b_temp_path, signatur
    raise


def install_src(
    collection,
    b_collection_path, b_collection_output_path,
    artifacts_manager,
):
def install_src(collection, b_collection_path, b_collection_output_path, artifacts_manager):
    r"""Install the collection from source control into given dir.

    Generates the Ansible collection artifact data from a galaxy.yml and
@ -5,37 +5,121 @@ from __future__ import (absolute_import, division, print_function)
|
|||
__metaclass__ = type
|
||||
|
||||
import ast
|
||||
import os
|
||||
import pyclbr
|
||||
import tokenize
|
||||
|
||||
from ansible.module_utils._text import to_text
|
||||
from ansible import constants as C
|
||||
from ansible.errors import AnsibleError
|
||||
from ansible.module_utils._text import to_text, to_native
|
||||
from ansible.parsing.yaml.loader import AnsibleLoader
|
||||
from ansible.utils.display import Display
|
||||
|
||||
display = Display()
|
||||
|
||||
|
||||
# NOTE: should move to just reading the variable as we do in plugin_loader since we already load as a 'module'
|
||||
# which is much faster than ast parsing ourselves.
|
||||
def read_docstring(filename, verbose=True, ignore_errors=True):
|
||||
string_to_vars = {
|
||||
'DOCUMENTATION': 'doc',
|
||||
'EXAMPLES': 'plainexamples',
|
||||
'RETURN': 'returndocs',
|
||||
'ANSIBLE_METADATA': 'metadata', # NOTE: now unused, but kept for backwards compat
|
||||
}
|
||||
|
||||
|
||||
def _init_doc_dict():
|
||||
''' initialize a return dict for docs with the expected structure '''
|
||||
return {k: None for k in string_to_vars.values()}
|
||||
|
||||
|
||||
def read_docstring_from_yaml_file(filename, verbose=True, ignore_errors=True):
|
||||
''' Read docs from 'sidecar' yaml file doc for a plugin '''
|
||||
|
||||
global string_to_vars
|
||||
data = _init_doc_dict()
|
||||
file_data = {}
|
||||
|
||||
try:
|
||||
with open(filename, 'rb') as yamlfile:
|
||||
file_data = AnsibleLoader(yamlfile.read(), file_name=filename).get_single_data()
|
||||
|
||||
except Exception:
|
||||
if verbose:
|
||||
display.error("unable to parse %s" % filename)
|
||||
if not ignore_errors:
|
||||
raise
|
||||
|
||||
for key in string_to_vars:
|
||||
data[string_to_vars[key]] = file_data.get(key, None)
|
||||
|
||||
return data
|
||||
|
||||
|
||||
def _find_jinja_function(filename, verbose=True, ignore_errors=True):
|
||||
|
||||
# for finding filters/tests
|
||||
module_name = os.path.splitext(os.path.basename(filename))[0]
|
||||
paths = [os.path.dirname(filename)]
|
||||
mdata = pyclbr.readmodule_ex(module_name, paths)


def read_docstring_from_python_module(filename, verbose=True, ignore_errors=True):
    """
    Search for assignment of the DOCUMENTATION and EXAMPLES variables in the given file.
    Use tokenization to search for assignment of the documentation variables in the given file.
    Parse from YAML and return the resulting python structure or None together with examples as plain text.
    """

    found = 0
    data = _init_doc_dict()

    next_string = None
    with tokenize.open(filename) as f:
        tokens = tokenize.generate_tokens(f.readline)
        for token in tokens:
            if token.type == tokenize.NAME:
                if token.string in string_to_vars:
                    next_string = string_to_vars[token.string]

            if next_string is not None and token.type == tokenize.STRING:

                found += 1

                value = token.string
                if value.startswith(('r', 'b')):
                    value = value.lstrip('rb')

                if value.startswith(("'", '"')):
                    value = value.strip("'\"")

                if next_string == 'plainexamples':
                    # keep as string
                    data[next_string] = to_text(value)
                else:
                    try:
                        data[next_string] = AnsibleLoader(value, file_name=filename).get_single_data()
                    except Exception as e:
                        if ignore_errors:
                            if verbose:
                                display.error("unable to parse %s" % filename)
                        else:
                            raise

                next_string = None

    # if nothing else worked, fall back to old method
    if not found:
        data = read_docstring_from_python_file(filename, verbose, ignore_errors)

    return data
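The tokenize-based scan above can be exercised on a small string: watch for a `NAME` token we care about, then capture the next `STRING` token. A simplified, self-contained sketch (it skips the raw/bytes-prefix handling and YAML parsing of the real function; names are illustrative):

```python
import io
import tokenize

SOURCE = 'DOCUMENTATION = "name: demo"\nEXAMPLES = "# none"\n'


def find_doc_strings(source, wanted=('DOCUMENTATION', 'EXAMPLES')):
    """Scan tokens for NAME tokens we care about, then grab the next STRING."""
    found = {}
    next_name = None
    for token in tokenize.generate_tokens(io.StringIO(source).readline):
        if token.type == tokenize.NAME and token.string in wanted:
            next_name = token.string
        elif next_name and token.type == tokenize.STRING:
            found[next_name] = token.string.strip('\'"')  # drop the quotes
            next_name = None
    return found


found = find_doc_strings(SOURCE)
```

The intervening `=` operator token is simply ignored because it is neither a `NAME` nor a `STRING`, which is exactly why the two-step state machine works.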


def read_docstring_from_python_file(filename, verbose=True, ignore_errors=True):
    """
    Use ast to search for assignment of the DOCUMENTATION and EXAMPLES variables in the given file.
    Parse DOCUMENTATION from YAML and return the YAML doc or None together with EXAMPLES, as plain text.
    """

    data = {
        'doc': None,
        'plainexamples': None,
        'returndocs': None,
        'metadata': None,  # NOTE: not used anymore, kept for compat
        'seealso': None,
    }

    string_to_vars = {
        'DOCUMENTATION': 'doc',
        'EXAMPLES': 'plainexamples',
        'RETURN': 'returndocs',
        'ANSIBLE_METADATA': 'metadata',  # NOTE: now unused, but kept for backwards compat
    }
    data = _init_doc_dict()
    global string_to_vars

    try:
        with open(filename, 'rb') as b_module_data:

@ -75,6 +159,27 @@ def read_docstring(filename, verbose=True, ignore_errors=True):
    return data


def read_docstring(filename, verbose=True, ignore_errors=True):
    ''' returns a documentation dictionary from Ansible plugin docstrings '''

    # TODO: ensure adjacency to code (including ps1 for py files)
    if filename.endswith(C.YAML_DOC_EXTENSIONS):
        docstring = read_docstring_from_yaml_file(filename, verbose=verbose, ignore_errors=ignore_errors)
    elif filename.endswith(C.PYTHON_DOC_EXTENSIONS):
        docstring = read_docstring_from_python_module(filename, verbose=verbose, ignore_errors=ignore_errors)
    elif not ignore_errors:
        raise AnsibleError("Unknown documentation format: %s" % to_native(filename))

    if not docstring and not ignore_errors:
        raise AnsibleError("Unable to parse documentation for: %s" % to_native(filename))

    # because seealso is specially processed from 'doc' later on
    # TODO: stop any other 'overloaded' implementation in main doc
    docstring['seealso'] = None

    return docstring


def read_docstub(filename):
    """
    Quickly find short_description using string methods instead of node parsing.

@ -104,7 +209,7 @@ def read_docstub(filename):
                indent_detection = ' ' * (len(line) - len(line.lstrip()) + 1)
                doc_stub.append(line)

            elif line.startswith('DOCUMENTATION') and '=' in line:
            elif line.startswith('DOCUMENTATION') and ('=' in line or ':' in line):
                in_documentation = True

    short_description = r''.join(doc_stub).strip().rstrip('.')
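The cheap string scan `read_docstub` performs can be sketched as follows; this simplified stand-in only handles a single-line `short_description` and omits the indent-detection and multi-line accumulation of the real function:

```python
def read_short_description(text):
    """Cheap scan: find the DOCUMENTATION block, then the short_description line."""
    in_doc = False
    for line in text.splitlines():
        # ':' is accepted alongside '=' so YAML sidecar docs also match
        if line.startswith('DOCUMENTATION') and ('=' in line or ':' in line):
            in_doc = True
        elif in_doc and 'short_description:' in line:
            return line.split('short_description:', 1)[1].strip().rstrip('.')
    return None


SAMPLE = '\n'.join([
    'DOCUMENTATION = """',
    'name: demo',
    'short_description: check things.',
    '"""',
])
desc = read_short_description(SAMPLE)
```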

@ -0,0 +1,44 @@
DOCUMENTATION:
  name: ternary
  author: Brian Coca (@bcoca)
  version_added: '1.9'
  short_description: Ternary operation filter
  description:
    - Return the first value if the input is C(True), the second if C(False).
  positional: true_val, false_val
  options:
    _input:
      description: A boolean expression, must evaluate to C(True) or C(False).
      type: bool
      required: true
    true_val:
      description: Value to return if the input is C(True).
      type: any
      required: true
    false_val:
      description: Value to return if the input is C(False).
      type: any
    none_val:
      description: Value to return if the input is C(None). If not set, C(None) will be treated as C(False).
      type: any
      version_added: '2.8'
  notes:
    - Vars used as values are evaluated even if not returned, as they are evaluated before being passed into the filter.

EXAMPLES: |
  # set first 10 volumes rw, rest as dp
  volume_mode: "{{ (item|int < 11)|ternary('rw', 'dp') }}"

  # choose correct vpc subnet id, note that vars as values are evaluated even if not returned
  vpc_subnet_id: "{{ (ec2_subnet_type == 'public') | ternary(ec2_vpc_public_subnet_id, ec2_vpc_private_subnet_id) }}"

  - name: service-foo, use systemd module unless upstart is present, then use old service module
    service:
      state: restarted
      enabled: yes
      use: "{{ (ansible_service_mgr == 'upstart') | ternary('service', 'systemd') }}"

RETURN:
  _value:
    description: The value indicated by the input.
    type: any
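The semantics documented above amount to a three-way branch; a plain-Python sketch of the behavior (not the actual Ansible implementation, which lives in the core filter plugin):

```python
def ternary(value, true_val, false_val, none_val=None):
    # mirror the documented semantics: None maps to none_val when one is given,
    # otherwise None is treated like False
    if value is None and none_val is not None:
        return none_val
    if bool(value):
        return true_val
    return false_val


mode = ternary(5 < 11, 'rw', 'dp')
```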

@ -0,0 +1,69 @@
DOCUMENTATION:
  name: to_json
  author: core team
  version_added: 'historical'
  short_description: Convert variable to JSON string
  description:
    - Converts an Ansible variable into a JSON string representation.
    - This filter functions as a wrapper to the Python ``json.dumps()`` function.
    - Ansible internally auto-converts JSON strings into variable structures so this plugin is used to force it into a JSON string.
  options:
    _input:
      description: A variable or expression that returns a data structure.
      type: raw
      required: true
    vault_to_text:
      description: Toggle to either unvault a vault or create the JSON version of a vaulted object.
      type: bool
      default: True
      version_added: '2.9'
    preprocess_unsafe:
      description: Toggle to represent unsafe values directly in JSON or create an unsafe object in JSON.
      type: bool
      default: True
      version_added: '2.9'
    allow_nan:
      description: When off, strict adherence to float value limits of the JSON spec, so C(nan), C(inf) and C(-inf) values will produce errors.
        When on, JavaScript equivalents will be used (C(NaN), C(Infinity), C(-Infinity)).
      default: True
      type: bool
    check_circular:
      description: Controls the usage of the internal circular reference detection; if off, can result in overflow errors.
      default: True
      type: bool
    ensure_ascii:
      description: Escapes all non-ASCII characters.
      default: True
      type: bool
    indent:
      description: Number of spaces to indent Python structures, mainly used for display to humans.
      default: 0
      type: integer
    separators:
      description: The C(item) and C(key) separator to be used in the serialized output,
        default may change depending on I(indent) and Python version.
      default: "(', ', ': ')"
      type: tuple
    skipkeys:
      description: If C(True), keys that are not basic Python types will be skipped.
      default: False
      type: bool
    sort_keys:
      description: Affects sorting of dictionary keys.
      default: False
      type: bool
  notes:
    - Both I(vault_to_text) and I(preprocess_unsafe) defaulted to C(False) between Ansible 2.9 and 2.12.
    - 'These parameters to ``json.dumps()`` will be ignored, as they are overridden internally: I(cls), I(default)'

EXAMPLES: |
  # dump variable in a template to create a JSON document
  {{ docker_config|to_json }}

  # same as above but 'prettier' (equivalent to to_nice_json filter)
  {{ docker_config|to_json(indent=4, sort_keys=True) }}

RETURN:
  _value:
    description: The JSON serialized string representing the variable structure inputted.
    type: string
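Since the filter wraps `json.dumps()`, the documented options map directly onto that function's keyword arguments. A quick stdlib demonstration of the ones described above:

```python
import json

data = {'b': 1, 'a': [True, None]}

compact = json.dumps(data)  # default separators: (', ', ': ')
pretty = json.dumps(data, indent=4, sort_keys=True)  # to_nice_json-style output
ascii_safe = json.dumps({'k': 'café'})  # ensure_ascii=True escapes non-ASCII
non_strict = json.dumps(float('nan'))  # allow_nan=True emits the JS-style literal
```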

@ -0,0 +1,54 @@
DOCUMENTATION:
  name: to_nice_json
  author: core team
  version_added: 'historical'
  short_description: Convert variable to 'nicely formatted' JSON string
  description:
    - Converts an Ansible variable into a 'nicely formatted' JSON string representation.
    - This filter functions as a wrapper to the Python ``json.dumps()`` function.
    - Ansible internally auto-converts JSON strings into variable structures so this plugin is used to force it into a JSON string.
  options:
    _input:
      description: A variable or expression that returns a data structure.
      type: raw
      required: true
    vault_to_text:
      description: Toggle to either unvault a vault or create the JSON version of a vaulted object.
      type: bool
      default: True
      version_added: '2.9'
    preprocess_unsafe:
      description: Toggle to represent unsafe values directly in JSON or create an unsafe object in JSON.
      type: bool
      default: True
      version_added: '2.9'
    allow_nan:
      description: When off, strict adherence to float value limits of the JSON spec, so C(nan), C(inf) and C(-inf) values will produce errors.
        When on, JavaScript equivalents will be used (C(NaN), C(Infinity), C(-Infinity)).
      default: True
      type: bool
    check_circular:
      description: Controls the usage of the internal circular reference detection; if off, can result in overflow errors.
      default: True
      type: bool
    ensure_ascii:
      description: Escapes all non-ASCII characters.
      default: True
      type: bool
    skipkeys:
      description: If C(True), keys that are not basic Python types will be skipped.
      default: False
      type: bool
  notes:
    - Both I(vault_to_text) and I(preprocess_unsafe) defaulted to C(False) between Ansible 2.9 and 2.12.
    - 'These parameters to ``json.dumps()`` will be ignored, as they are overridden for internal use: I(cls), I(default), I(indent), I(separators), I(sort_keys).'

EXAMPLES: |
  # dump variable in a template to create a nicely formatted JSON document
  {{ docker_config|to_nice_json }}


RETURN:
  _value:
    description: The 'nicely formatted' JSON serialized string representing the variable structure inputted.
    type: string

@ -0,0 +1,20 @@
DOCUMENTATION:
  name: type_debug
  author: Adrian Likins (@alikins)
  version_added: "2.3"
  short_description: show input data type
  description:
    - Returns the equivalent of Python's ``type()`` function.
  options:
    _input:
      description: Variable or expression of which you want to determine the type.
      type: any
      required: true
EXAMPLES: |
  # get type of 'myvar'
  {{ myvar | type_debug }}

RETURN:
  _value:
    description: The Python 'type' of the I(_input) provided.
    type: string
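The behavior described is essentially Python's `type()` with the class name extracted; a one-line sketch of what the filter reports:

```python
def type_debug(obj):
    # the filter simply reports the underlying Python type name
    return type(obj).__name__


kinds = [type_debug(x) for x in (1, 'a', [1], {'k': 1}, True)]
```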

@ -0,0 +1,36 @@
DOCUMENTATION:
  name: unvault
  author: Brian Coca (@bcoca)
  version_added: "2.12"
  short_description: Open an Ansible Vault
  description:
    - Retrieve your information from an encrypted Ansible Vault.
  positional: secret
  options:
    _input:
      description: Vault string, or an C(AnsibleVaultEncryptedUnicode) string object.
      type: string
      required: true
    secret:
      description: Vault secret, the key that lets you open the vault.
      type: string
      required: true
    vault_id:
      description: Secret identifier, used internally to try to best match a secret when multiple are provided.
      type: string
      default: 'filter_default'

EXAMPLES: |
  # simply decrypt my key from a vault
  vars:
    mykey: "{{ myvaultedkey|unvault(passphrase) }}"

  - name: save templated unvaulted data
    template: src=dump_template_data.j2 dest=/some/key/clear.txt
    vars:
      template_data: '{{ secretdata|unvault(vaultsecret) }}'

RETURN:
  _value:
    description: The string that was contained in the vault.
    type: string

@ -0,0 +1,48 @@
DOCUMENTATION:
  name: vault
  author: Brian Coca (@bcoca)
  version_added: "2.12"
  short_description: vault your secrets
  description:
    - Put your information into an encrypted Ansible Vault.
  positional: secret
  options:
    _input:
      description: Data to vault.
      type: string
      required: true
    secret:
      description: Vault secret, the key that lets you open the vault.
      type: string
      required: true
    salt:
      description:
        - Encryption salt, will be random if not provided.
        - While providing one makes the resulting encrypted string reproducible, it can lower the security of the vault.
      type: string
    vault_id:
      description: Secret identifier, used internally to try to best match a secret when multiple are provided.
      type: string
      default: 'filter_default'
    wrap_object:
      description:
        - This toggle can force the return of an C(AnsibleVaultEncryptedUnicode) string object; when C(False), you get a simple string.
        - Mostly useful when combining with the C(to_yaml) filter to output the 'inline vault' format.
      type: bool
      default: False

EXAMPLES: |
  # simply encrypt my key in a vault
  vars:
    myvaultedkey: "{{ keyrawdata|vault(passphrase) }}"

  - name: save templated vaulted data
    template: src=dump_template_data.j2 dest=/some/key/vault.txt
    vars:
      mysalt: '{{ 2**256 | random(seed=inventory_hostname) }}'
      template_data: '{{ secretdata|vault(vaultsecret, salt=mysalt) }}'

RETURN:
  _value:
    description: The vault string that contains the secret data (or C(AnsibleVaultEncryptedUnicode) string object).
    type: string

@ -0,0 +1,233 @@
# (c) Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)

from __future__ import (absolute_import, division, print_function)
__metaclass__ = type


import os


from ansible import constants as C
from ansible.collections.list import list_collections
from ansible.errors import AnsibleError
from ansible.module_utils._text import to_native, to_bytes
from ansible.plugins import loader
from ansible.utils.display import Display
from ansible.utils.path import is_subpath
from ansible.utils.collection_loader._collection_finder import _get_collection_path

display = Display()

# not real plugins
IGNORE = {
    # ptype: names
    'module': ('async_wrapper', ),
    'cache': ('base', ),
}


def _list_plugins_from_paths(ptype, dirs, collection, depth=0):

    plugins = {}

    for path in dirs:
        display.debug("Searching '{0}'s '{1}' for {2} plugins".format(collection, path, ptype))
        b_path = to_bytes(path)

        if os.path.basename(b_path).startswith((b'.', b'__')):
            # skip hidden/special dirs
            continue

        if os.path.exists(b_path):
            if os.path.isdir(b_path):
                bkey = ptype.lower()
                for plugin_file in os.listdir(b_path):

                    if plugin_file.startswith((b'.', b'__')):
                        # hidden or python internal file/dir
                        continue

                    display.debug("Found possible plugin: '{0}'".format(plugin_file))
                    b_plugin, ext = os.path.splitext(plugin_file)
                    plugin = to_native(b_plugin)
                    full_path = os.path.join(b_path, plugin_file)

                    if os.path.isdir(full_path):
                        # it's a dir, recurse
                        if collection in C.SYNTHETIC_COLLECTIONS:
                            if not os.path.exists(os.path.join(full_path, b'__init__.py')):
                                # don't recurse for synthetic unless __init__.py is present
                                continue

                        # actually recurse dirs
                        plugins.update(_list_plugins_from_paths(ptype, [to_native(full_path)], collection, depth=depth + 1))
                    else:
                        if any([
                            plugin in C.IGNORE_FILES,  # general files to ignore
                            ext in C.REJECT_EXTS,  # general extensions to ignore
                            plugin in IGNORE.get(bkey, ()),  # plugin in reject list
                        ]):
                            continue

                        if ptype in ('test', 'filter'):
                            ploader = getattr(loader, '{0}_loader'.format(ptype))

                            if ptype == 'filter':
                                method_name = 'filters'
                            elif ptype == 'test':
                                method_name = 'tests'
                            else:
                                raise AnsibleError('how did you get here?')

                            try:
                                added = False
                                if path not in ploader._extra_dirs:
                                    ploader.add_directory(path)
                                    added = True
                                for plugin_map in ploader.all():
                                    if not is_subpath(plugin_map._original_path, path, real=True):
                                        # loader will not restrict to collection so we need to do it here
                                        # requires both to be 'real' since the loader resolves symlinks
                                        continue
                                    try:
                                        # uses the jinja2 method tests/filters to get 'name -> function' map
                                        method_map = getattr(plugin_map, method_name)
                                        jplugins = method_map()
                                        seen = set()
                                        # skip aliases, names that reference the same function
                                        for candidate in jplugins:
                                            if jplugins[candidate] not in seen:
                                                # use names and associate to actual file instead of 'function'
                                                composite = [collection]
                                                if depth:
                                                    composite.extend(plugin_map._original_path.split(os.path.sep)[depth * -1:])
                                                composite.append(to_native(candidate))
                                                fqcn = '.'.join(composite)
                                                plugins[fqcn] = plugin_map._original_path
                                                seen.add(jplugins[candidate])
                                    except Exception as e:
                                        display.warning("Skipping plugin file %s as it seems to be invalid: %r" % (to_native(plugin_map._original_path), e))
                            finally:
                                if added:
                                    ploader._extra_dirs.remove(os.path.realpath(path))
                                    ploader._clear_caches()
                        else:
                            # collectionize name
                            composite = [collection]
                            if depth:
                                composite.extend(path.split(os.path.sep)[depth * -1:])
                            composite.append(to_native(plugin))
                            plugin = '.'.join(composite)

                            if not os.path.islink(full_path):
                                # skip aliases, author should document in 'aliases' field
                                plugins[plugin] = full_path
            else:
                display.debug("Skip listing plugins in '{0}' as it is not a directory".format(path))
        else:
            display.debug("Skip listing plugins in '{0}' as it does not exist".format(path))

    return plugins
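The non-Jinja2 branch of the directory walk above boils down to listing a directory and filtering candidates by name and extension. A self-contained sketch, with simplified stand-ins for `C.IGNORE_FILES` and `C.REJECT_EXTS` (the real constants cover more cases):

```python
import os
import tempfile

IGNORE_FILES = ('__init__',)          # stand-in for C.IGNORE_FILES
REJECT_EXTS = ('.pyc', '.pyo', '.yml')  # stand-in for C.REJECT_EXTS


def candidate_plugins(path):
    """List plausible plugin names in a directory, skipping hidden and rejected files."""
    names = []
    for entry in sorted(os.listdir(path)):
        if entry.startswith(('.', '__')):
            continue  # hidden or python-internal
        base, ext = os.path.splitext(entry)
        if base in IGNORE_FILES or ext in REJECT_EXTS:
            continue
        names.append(base)
    return names


with tempfile.TemporaryDirectory() as tmp:
    for fname in ('ternary.py', 'ternary.yml', '__init__.py', '.hidden', 'core.pyc', 'urls.py'):
        open(os.path.join(tmp, fname), 'w').close()
    found = candidate_plugins(tmp)
```

Note that the `.yml` sidecar is rejected here on extension alone; the real listing code pairs it back up with its companion plugin file through the doc loader.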


def list_collection_plugins(ptype, collections, search_paths=None):

    # starts at {plugin_name: filepath, ...}, but changes at the end
    plugins = {}
    dirs = []
    try:
        ploader = getattr(loader, '{0}_loader'.format(ptype))
    except AttributeError:
        raise AnsibleError('Cannot list plugins, incorrect plugin type supplied: {0}'.format(ptype))

    # get plugins for each collection
    for collection in collections.keys():
        if collection == 'ansible.builtin':
            # dirs from ansible install, but not configured paths
            dirs.extend([d.path for d in ploader._get_paths_with_context() if d.path not in ploader.config])
        elif collection == 'ansible.legacy':
            # configured paths + search paths (should include basedirs/-M)
            dirs = ploader.config
            if search_paths is not None:
                for d in search_paths:
                    if not d.endswith(ploader.subdir):
                        d = os.path.join(d, ploader.subdir)
                    dirs.append(d)
        else:
            # search path in this case is for locating the collection itself
            b_ptype = to_bytes(C.COLLECTION_PTYPE_COMPAT.get(ptype, ptype))
            dirs = [to_native(os.path.join(collections[collection], b'plugins', b_ptype))]

        plugins.update(_list_plugins_from_paths(ptype, dirs, collection))

    # return the plugin and its class object, None for those not verifiable or failing
    if ptype in ('module',):
        # no 'invalid' tests for modules
        for plugin in plugins.keys():
            plugins[plugin] = (plugins[plugin], None)
    else:
        # detect invalid plugin candidates AND add loaded object to return data
        for plugin in list(plugins.keys()):
            pobj = None
            try:
                pobj = ploader.get(plugin, class_only=True)
            except Exception as e:
                display.vvv("The '{0}' {1} plugin could not be loaded from '{2}': {3}".format(plugin, ptype, plugins[plugin], to_native(e)))

            # sets final {plugin_name: (filepath, class|None if not loaded), ...}
            plugins[plugin] = (plugins[plugin], pobj)

    # {plugin_name: (filepath, class), ...}
    return plugins


def list_plugins(ptype, collection=None, search_paths=None):

    # {plugin_name: (filepath, class), ...}
    plugins = {}
    do_legacy = False
    collections = {}
    if collection is None:
        # list all collections
        collections['ansible.builtin'] = b''
        collections.update(list_collections(search_paths=search_paths, dedupe=True))
        do_legacy = True
    elif collection == 'ansible.builtin':
        collections['ansible.builtin'] = b''
    elif collection == 'ansible.legacy':
        do_legacy = True
    else:
        try:
            collections[collection] = to_bytes(_get_collection_path(collection))
        except ValueError as e:
            raise AnsibleError("Cannot use supplied collection {0}: {1}".format(collection, to_native(e)), orig_exc=e)

    if collections:
        plugins.update(list_collection_plugins(ptype, collections))

    if do_legacy:
        legacy = list_collection_plugins(ptype, {'ansible.legacy': search_paths})
        for plugin in legacy.keys():
            builtin = plugin.replace('ansible.legacy.', 'ansible.builtin.', 1)
            if builtin in plugins and legacy[plugin][0] == plugins[builtin][0]:
                # skip when the legacy plugin is the same file as its builtin equivalent
                continue
            plugins[plugin] = legacy[plugin]

    return plugins
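The legacy dedupe step can be isolated: an `ansible.legacy` entry is dropped when its file is already listed under the equivalent `ansible.builtin` name. A sketch with hypothetical paths:

```python
def merge_legacy(plugins, legacy):
    """Add ansible.legacy entries unless they are the same file as the builtin one."""
    merged = dict(plugins)
    for name, (path, obj) in legacy.items():
        builtin = name.replace('ansible.legacy.', 'ansible.builtin.', 1)
        if builtin in merged and merged[builtin][0] == path:
            continue  # same file is already listed under ansible.builtin
        merged[name] = (path, obj)
    return merged


plugins = {'ansible.builtin.ternary': ('/core/ternary.py', None)}
legacy = {
    'ansible.legacy.ternary': ('/core/ternary.py', None),      # duplicate of builtin
    'ansible.legacy.myfilter': ('/custom/myfilter.py', None),  # genuinely local
}
merged = merge_legacy(plugins, legacy)
```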


# wrappers
def list_plugin_names(ptype, collection=None):
    return list_plugins(ptype, collection).keys()


def list_plugin_files(ptype, collection=None):
    plugins = list_plugins(ptype, collection)
    return [plugins[k][0] for k in plugins.keys()]


def list_plugin_classes(ptype, collection=None):
    plugins = list_plugins(ptype, collection)
    return [plugins[k][1] for k in plugins.keys()]

@ -995,13 +995,11 @@ class Jinja2Loader(PluginLoader):

    We can't use the base class version because of file == plugin assumptions and dedupe logic
    """
    def find_plugin(self, name, collection_list=None):
    def find_plugin(self, name, mod_type='', ignore_deprecated=False, check_aliases=False, collection_list=None):
        ''' this is really 'find plugin file' '''

        if '.' in name:  # NOTE: this is the wrong way; use AnsibleCollectionRef.is_valid_fqcr(name) or collection_list
            return super(Jinja2Loader, self).find_plugin(name, collection_list=collection_list)

        # Nothing is currently using this method
        raise AnsibleError('No code should call "find_plugin" for Jinja2Loaders (Not implemented)')
            return super(Jinja2Loader, self).find_plugin(name, mod_type=mod_type, ignore_deprecated=ignore_deprecated, check_aliases=check_aliases,
                                                         collection_list=collection_list)

    def get(self, name, *args, **kwargs):


@ -1009,7 +1007,7 @@ class Jinja2Loader(PluginLoader):
            return super(Jinja2Loader, self).get(name, *args, **kwargs)

        # Nothing is currently using this method
        raise AnsibleError('No code should call "get" for Jinja2Loaders (Not implemented)')
        raise AnsibleError('No code should call "get" for Jinja2Loaders (Not implemented) for non collection use')

    def all(self, *args, **kwargs):
        """

@ -0,0 +1 @@
changed.yml

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: changed
  author: Ansible Core
  version_added: "1.9"
  short_description: check if task required changes
  description:
    - Tests if a task required changes to complete.
    - This test checks for the existence of a C(changed) key in the input dictionary and that it is C(True) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ taskresults is changed }}

RETURN:
  _value:
    description: Returns C(True) if the task required changes, C(False) otherwise.
    type: boolean

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: failed
  author: Ansible Core
  version_added: "1.9"
  short_description: check if task failed
  description:
    - Tests if a task finished in failure; opposite of C(success).
    - This test checks for the existence of a C(failed) key in the input dictionary and that it is C(True) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ taskresults is failed }}

RETURN:
  _value:
    description: Returns C(True) if the task failed, C(False) otherwise.
    type: boolean

@ -0,0 +1 @@
failed.yml

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: finished
  author: Ansible Core
  version_added: "1.9"
  short_description: check if a task has finished
  description:
    - Used to test if an async task has finished; it will also work with normal tasks but will issue a warning.
    - This test checks for the existence of a C(finished) key in the input dictionary and that it is C(1) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ asynctaskpoll is finished }}

RETURN:
  _value:
    description: Returns C(True) if the async task has finished, C(False) otherwise.
    type: boolean

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: reachable
  author: Ansible Core
  version_added: "1.9"
  short_description: check task didn't return that host was unreachable
  description:
    - Tests if the task was able to reach the host for execution.
    - This test checks for the existence of a C(unreachable) key in the input dictionary and that it is C(False) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ taskresults is reachable }}

RETURN:
  _value:
    description: Returns C(True) if the task did not flag the host as unreachable, C(False) otherwise.
    type: boolean

@ -0,0 +1 @@
skipped.yml

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: skipped
  author: Ansible Core
  version_added: "1.9"
  short_description: check if task was skipped
  description:
    - Tests if a task was skipped.
    - This test checks for the existence of a C(skipped) key in the input dictionary and that it is C(True) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ taskresults is skipped }}

RETURN:
  _value:
    description: Returns C(True) if the task was skipped, C(False) otherwise.
    type: boolean

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: started
  author: Ansible Core
  version_added: "1.9"
  short_description: check if a task has started
  description:
    - Used to check if an async task has started; it will also work with non-async tasks but will issue a warning.
    - This test checks for the existence of a C(started) key in the input dictionary and that it is C(1) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ asynctaskpoll is started }}

RETURN:
  _value:
    description: Returns C(True) if the task has started, C(False) otherwise.
    type: boolean

@ -0,0 +1 @@
success.yml

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: success
  author: Ansible Core
  version_added: "1.9"
  short_description: check task success
  description:
    - Tests if a task finished successfully; opposite of C(failed).
    - This test checks for the existence of a C(failed) key in the input dictionary and that it is C(False) if present.
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ taskresults is success }}

RETURN:
  _value:
    description: Returns C(True) if the task was successfully completed, C(False) otherwise.
    type: boolean

@ -0,0 +1 @@
success.yml

@ -0,0 +1,21 @@
DOCUMENTATION:
  name: unreachable
  author: Ansible Core
  version_added: "1.9"
  short_description: check task returned that the host was unreachable
  description:
    - Tests if the task was not able to reach the host for execution.
    - This test checks for the existence of a C(unreachable) key in the input dictionary and that its value is C(True).
  options:
    _input:
      description: registered result from an Ansible task
      type: dictionary
      required: True
EXAMPLES: |
  # test 'status' to know how to respond
  {{ taskresults is unreachable }}

RETURN:
  _value:
    description: Returns C(True) if the task flagged the host as unreachable, C(False) otherwise.
    type: boolean

@ -960,6 +960,18 @@ class AnsibleCollectionRef:
        )


def _get_collection_path(collection_name):
    collection_name = to_native(collection_name)
    if not collection_name or not isinstance(collection_name, string_types) or len(collection_name.split('.')) != 2:
        raise ValueError('collection_name must be a non-empty string of the form namespace.collection')
    try:
        collection_pkg = import_module('ansible_collections.' + collection_name)
    except ImportError:
        raise ValueError('unable to locate collection {0}'.format(collection_name))

    return to_native(os.path.dirname(to_bytes(collection_pkg.__file__)))
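The helper resolves a dotted package name to its on-disk directory via `import_module` and `__file__`; the same pattern works for any importable package. In this sketch the stdlib `json` package stands in for an `ansible_collections` namespace package:

```python
import os
from importlib import import_module


def package_dir(dotted_name):
    """Resolve an importable package to the directory that holds it."""
    pkg = import_module(dotted_name)
    # for a package, __file__ points at its __init__.py, so dirname is the package dir
    return os.path.dirname(pkg.__file__)


path = package_dir('json')  # 'json' is just a stand-in importable package
```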
|
||||
|
||||
|
||||
def _get_collection_playbook_path(playbook):
|
||||
|
||||
acr = AnsibleCollectionRef.try_parse_fqcr(playbook, u'playbook')
|
||||
|
|
|
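The new ``_get_collection_path()`` validates the collection name's shape before attempting the import. That validation step can be isolated as a standalone sketch (``looks_like_collection_name`` is a hypothetical helper name, not part of the patch):

```python
def looks_like_collection_name(collection_name):
    """A collection name must be a non-empty string of exactly two
    dot-separated parts: namespace.collection."""
    if not collection_name or not isinstance(collection_name, str):
        return False
    return len(collection_name.split('.')) == 2


print(looks_like_collection_name('testns.testcol'))  # -> True
print(looks_like_collection_name('testns'))          # -> False
print(looks_like_collection_name('a.b.c'))           # -> False
```

Rejecting malformed names up front lets the function raise a clear ``ValueError`` instead of a confusing ``ImportError`` from a nonsense ``ansible_collections`` submodule path.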
@@ -134,7 +134,7 @@ def cleanup_tmp_file(path, warn=False):
         pass


-def is_subpath(child, parent):
+def is_subpath(child, parent, real=False):
    """
    Compares paths to check if one is contained in the other
    :arg: child: Path to test
@@ -145,6 +145,10 @@ def is_subpath(child, parent):
     abs_child = unfrackpath(child, follow=False)
     abs_parent = unfrackpath(parent, follow=False)

+    if real:
+        abs_child = os.path.realpath(abs_child)
+        abs_parent = os.path.realpath(abs_parent)
+
     c = abs_child.split(os.path.sep)
     p = abs_parent.split(os.path.sep)
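The containment check works component-wise: both paths are normalized, optionally resolved through symlinks (the new ``real=True`` flag), then the parent's path components must be a prefix of the child's. A self-contained sketch, with ``unfrackpath()`` replaced by ``os.path.abspath()`` as an assumption for illustration:

```python
import os


def is_subpath(child, parent, real=False):
    # normalize both paths; the real implementation uses unfrackpath()
    abs_child = os.path.abspath(child)
    abs_parent = os.path.abspath(parent)

    if real:
        # resolve symlinks too, as the new real=True flag does
        abs_child = os.path.realpath(abs_child)
        abs_parent = os.path.realpath(abs_parent)

    c = abs_child.split(os.path.sep)
    p = abs_parent.split(os.path.sep)

    # the parent's components must be a prefix of the child's; comparing
    # components (not string prefixes) keeps /etc/ansible2 out of /etc/ansible
    return c[:len(p)] == p
```

Splitting on ``os.path.sep`` rather than using ``str.startswith`` avoids the classic false positive where a sibling directory shares a name prefix.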
@@ -19,13 +19,6 @@ from ansible.utils.vars import combine_vars
 display = Display()


-# modules that are ok that they do not have documentation strings
-REJECTLIST = {
-    'MODULE': frozenset(('async_wrapper',)),
-    'CACHE': frozenset(('base',)),
-}
-
-
 def merge_fragment(target, source):

     for key, value in source.items():
@@ -214,11 +207,20 @@ def add_fragments(doc, filename, fragment_loader, is_module=False):
         raise AnsibleError('unknown doc_fragment(s) in file {0}: {1}'.format(filename, to_native(', '.join(unknown_fragments))))


-def get_docstring(filename, fragment_loader, verbose=False, ignore_errors=False, collection_name=None, is_module=False):
+def get_docstring(filename, fragment_loader, verbose=False, ignore_errors=False, collection_name=None, is_module=None, plugin_type=None):
     """
     DOCUMENTATION can be extended using documentation fragments loaded by the PluginLoader from the doc_fragments plugins.
     """

+    if is_module is None:
+        if plugin_type is None:
+            is_module = False
+        else:
+            is_module = (plugin_type == 'module')
+    else:
+        # TODO deprecate is_module argument, now that we have 'type'
+        pass
+
     data = read_docstring(filename, verbose=verbose, ignore_errors=ignore_errors)

     if data.get('doc', False):
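The hunk above keeps ``get_docstring()`` backwards compatible: the new ``plugin_type`` parameter only takes effect when the legacy ``is_module`` flag is left at its new default of ``None``. The precedence can be extracted as a standalone helper (``resolve_is_module`` is a hypothetical name for this sketch):

```python
def resolve_is_module(is_module=None, plugin_type=None):
    if is_module is None:
        if plugin_type is None:
            return False
        return plugin_type == 'module'
    # an explicitly passed legacy flag wins; plugin_type is ignored
    return is_module


print(resolve_is_module())                           # -> False
print(resolve_is_module(plugin_type='module'))       # -> True
print(resolve_is_module(plugin_type='filter'))       # -> False
print(resolve_is_module(is_module=True))             # -> True
```

Changing the default from ``False`` to ``None`` is what makes this possible: it lets the function distinguish "caller said nothing" from "caller explicitly passed ``is_module=False``".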
@@ -43,9 +43,8 @@ do
     justcol=$(ansible-doc -l -t ${ptype} --playbook-dir ./ testns.testcol|wc -l)
     test "$justcol" -eq 1

-    # ensure we get 0 plugins when restricting to collection, but not supplying it
-    justcol=$(ansible-doc -l -t ${ptype} testns.testcol|wc -l)
-    test "$justcol" -eq 0
+    # ensure we get an error if passing an invalid collection, much less any plugins
+    ansible-doc -l -t ${ptype} testns.testcol 2>&1 | grep "unable to locate collection"

     # TODO: do we want per namespace?
     # ensure we get 1 plugins when restricting namespace
@@ -71,7 +71,8 @@ from ansible.module_utils.compat.version import StrictVersion, LooseVersion
 from ansible.module_utils.basic import to_bytes
 from ansible.module_utils.six import PY3, with_metaclass, string_types
 from ansible.plugins.loader import fragment_loader
-from ansible.utils.plugin_docs import REJECTLIST, add_collection_to_versions_and_dates, add_fragments, get_docstring
+from ansible.plugins.list import IGNORE as REJECTLIST
+from ansible.utils.plugin_docs import add_collection_to_versions_and_dates, add_fragments, get_docstring
 from ansible.utils.version import SemanticVersion

 from .module_args import AnsibleModuleImportError, AnsibleModuleNotInitialized, get_argument_spec
@@ -294,7 +295,7 @@ class ModuleValidator(Validator):
     REJECTLIST_FILES = frozenset(('.git', '.gitignore', '.travis.yml',
                                   '.gitattributes', '.gitmodules', 'COPYING',
                                   '__init__.py', 'VERSION', 'test-docs.sh'))
-    REJECTLIST = REJECTLIST_FILES.union(REJECTLIST['MODULE'])
+    REJECTLIST = REJECTLIST_FILES.union(REJECTLIST['module'])

     PS_DOC_REJECTLIST = frozenset((
         'async_status.ps1',
@@ -24,6 +24,8 @@ def main():
         'lib/ansible/galaxy/data/',
     )

+    allow_yaml = ('lib/ansible/plugins/test', 'lib/ansible/plugins/filter')
+
     for path in paths:
         if path in skip_paths:
             continue
@@ -36,6 +38,8 @@ def main():
             continue

         ext = os.path.splitext(path)[1]
+        if ext in ('.yml', ) and any(path.startswith(yaml_directory) for yaml_directory in allow_yaml):
+            continue

         if ext not in allowed_extensions:
             print('%s: extension must be one of: %s' % (path, ', '.join(allowed_extensions)))
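The sanity-check change above adds a narrow exception: ``.yml`` files are now accepted, but only under the test/filter plugin trees where documentation may live in a companion YAML file next to the plugin. A self-contained sketch of that decision (the ``extension_allowed`` helper and its default extension tuple are assumptions for illustration):

```python
import os

# directories where companion .yml doc files are permitted
ALLOW_YAML = ('lib/ansible/plugins/test', 'lib/ansible/plugins/filter')


def extension_allowed(path, allowed_extensions=('.py', '.ps1')):
    ext = os.path.splitext(path)[1]
    # .yml is only acceptable inside the allow-listed plugin trees
    if ext == '.yml' and any(path.startswith(d) for d in ALLOW_YAML):
        return True
    return ext in allowed_extensions


print(extension_allowed('lib/ansible/plugins/test/success.yml'))   # -> True
print(extension_allowed('lib/ansible/plugins/action/foo.yml'))     # -> False
print(extension_allowed('lib/ansible/modules/ping.py'))            # -> True
```

Gating on the directory prefix keeps the check from silently blessing stray ``.yml`` files elsewhere in the package tree.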
@@ -127,4 +127,4 @@ def test_legacy_modules_list():
     obj = DocCLI(args=args)
     obj.parse()
     result = obj._list_plugins('module', module_loader)
-    assert len(result) > 0
+    assert len(result) == 0