OpenXR 1.0.2 release (27-August-2019)

Patch release for the 1.0 series. Updates version to 1.0.2.

### Public issues

- Pull request #30 - Fix parameter name typo in XR_MSFT_spatial_anchor

### Internal issues

- Enhance xml_consistency script. (Internal MR 1526)
- Sync scripts from Vulkan. (Internal MR 1514)
- Port the equivalent of Vulkan's internal MR 3319 to OpenXR, affecting empty bitmask generated implicit valid usage. (Internal MR 1513)
- Fix error in extension-added function. (Internal MR 1510)
- Add Oculus Android extension. (Internal MR 1518)
- Reserve additional extension number for Oculus. (Internal MR 1517)

parent 186b101132
commit 4522df7189
@@ -11,6 +11,26 @@ along with any public pull requests that have been accepted.
 This changelog only lists changes that affect the registry,
 headers, and/or specification text.

+## OpenXR 1.0.2 release (27-August-2019)
+
+Patch release for the 1.0 series.
+
+Updates version to 1.0.2.
+
+### Public issues
+
+- Pull request #30 - Fix parameter name typo in XR_MSFT_spatial_anchor
+
+### Internal issues
+
+- Enhance xml_consistency script. (Internal MR 1526)
+- Sync scripts from Vulkan. (Internal MR 1514)
+- Port the equivalent of Vulkan's internal MR 3319 to OpenXR,
+  affecting empty bitmask generated implicit valid usage. (Internal MR 1513)
+- Fix error in extension-added function. (Internal MR 1510)
+- Add Oculus Android extension. (Internal MR 1518)
+- Reserve additional extension number for Oculus. (Internal MR 1517)
+
 ## OpenXR 1.0.1 release (2-August-2019)

 Patch release for the 1.0 series.
@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-SHELL = /bin/bash
+SHELL = /usr/bin/env bash

 QUIET ?= @
 PYTHON ?= python3
@@ -43,7 +43,7 @@ ifneq (,$(strip $(VERY_STRICT)))
 ASCIIDOC := $(ASCIIDOC) --failure-level WARN
 endif

-SPECREVISION = 1.0.1
+SPECREVISION = 1.0.2
 REVISION_COMPONENTS = $(subst ., ,$(SPECREVISION))
 MAJORMINORVER = $(word 1,$(REVISION_COMPONENTS)).$(word 2,$(REVISION_COMPONENTS))
@@ -446,9 +446,9 @@ CHECK_SPEC_LINKS_SCRIPT = $(CURDIR)/scripts/check_spec_links.py --ignore_count=0
 # Would like those in the logs, but not in the count for build-breaking.

 check-spec-links:
-	$(QUIET)$(CHECK_MARKUP_SCRIPT) -Werror
-	$(QUIET)$(PYTHON) $(CHECK_SPEC_LINKS_SCRIPT)
-	$(QUIET)env FAIL_IF_COULD_NOT_VALIDATE=false ./checkXml.sh
+	$(QUIET)if [ -f $(CHECK_MARKUP_SCRIPT) ] && [ -f $(SPECSRC) ]; then $(CHECK_MARKUP_SCRIPT) -Werror; fi
+	$(QUIET)if [ -f $(SPECSRC) ]; then $(PYTHON) $(CHECK_SPEC_LINKS_SCRIPT); fi
+	$(QUIET)if [ -f checkXml.sh ] && [ -f registry/registry.rnc ]; then env FAIL_IF_COULD_NOT_VALIDATE=false ./checkXml.sh; fi
 	$(QUIET)$(PYTHON) $(SCRIPTS)/xml_consistency.py

 .PHONY: check-spec-links
@@ -6,4 +6,4 @@
 # Similarly, update specification/scripts/extensionmetadocgenerator.py as well.
 MAJOR=1
 MINOR=0
-PATCH=1
+PATCH=2
@@ -112,7 +112,7 @@ maintained in the master branch of the Khronos OpenXR GitHub project.
     updates them automatically by processing a line at a time.
 -->
         <type category="define">// OpenXR current version number.
-#define <name>XR_CURRENT_API_VERSION</name> <type>XR_MAKE_VERSION</type>(1, 0, 1)</type>
+#define <name>XR_CURRENT_API_VERSION</name> <type>XR_MAKE_VERSION</type>(1, 0, 2)</type>

         <!--
         NOTE: For avoidance of ambiguity, there should only be 1 <name> tag immediately in
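The version bump above feeds XR_CURRENT_API_VERSION through the XR_MAKE_VERSION macro. A minimal Python sketch of the packing that macro performs (the 16-bit major / 16-bit minor / 32-bit patch layout in a 64-bit XrVersion follows the OpenXR headers; the helper itself is illustrative, not part of this commit):

```python
def xr_make_version(major, minor, patch):
    # Pack like the C macro XR_MAKE_VERSION: 16-bit major, 16-bit minor,
    # and 32-bit patch combined into a single 64-bit XrVersion value.
    return ((major & 0xFFFF) << 48) | ((minor & 0xFFFF) << 32) | (patch & 0xFFFFFFFF)

XR_CURRENT_API_VERSION = xr_make_version(1, 0, 2)
```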
@@ -978,7 +978,7 @@ maintained in the master branch of the Khronos OpenXR GitHub project.
             <member><type>XrTime</type> <name>time</name></member>
         </type>
         <type category="struct" name="XrSpatialAnchorSpaceCreateInfoMSFT">
-            <member values="XR_TYPE_ACTION_SPACE_CREATE_INFO"><type>XrStructureType</type> <name>type</name></member>
+            <member values="XR_TYPE_SPATIAL_ANCHOR_SPACE_CREATE_INFO_MSFT"><type>XrStructureType</type> <name>type</name></member>
             <member>const <type>void</type>* <name>next</name></member>
             <member><type>XrSpatialAnchorMSFT</type> <name>anchor</name></member>
             <member><type>XrPosef</type> <name>poseInAnchorSpace</name></member>
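The fix above makes the struct's `values` attribute follow the usual naming convention: the XR_TYPE_* token is derived from the struct name, with the vendor tag (MSFT) kept as a suffix. A hypothetical sketch of that derivation, which the xml_consistency check added in this release can enforce (the function name and regexes are illustrative, not the actual script's API):

```python
import re

def structure_type_from_name(name, prefix="Xr"):
    # Derive the expected XR_TYPE_* token from a struct name like
    # 'XrSpatialAnchorSpaceCreateInfoMSFT': split CamelCase words,
    # upper-case them, and keep a trailing vendor tag as a suffix.
    body = name[len(prefix):]
    tag = ''
    m = re.search(r'[A-Z]+$', body)
    if m and len(m.group(0)) >= 2:
        tag = '_' + m.group(0)
        body = body[:m.start()]
    words = re.findall(r'[A-Z][a-z0-9]*', body)
    return 'XR_TYPE_' + '_'.join(w.upper() for w in words) + tag
```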
@@ -2344,5 +2344,20 @@ maintained in the master branch of the Khronos OpenXR GitHub project.
             <enum value="&quot;XR_MND_headless&quot;" name="XR_MND_HEADLESS_EXTENSION_NAME"/>
         </require>
     </extension>
+
+    <extension name="XR_OCULUS_extension_44" number="44" type="instance" supported="disabled">
+        <require>
+            <enum value="1" name="XR_OCULUS_extension_44_SPEC_VERSION"/>
+            <enum value="&quot;XR_OCULUS_extension_44&quot;" name="XR_OCULUS_EXTENSION_44_EXTENSION_NAME"/>
+        </require>
+    </extension>
+
+    <extension name="XR_OCULUS_android_session_state_enable" number="45" type="instance" supported="openxr">
+        <require>
+            <enum value="1" name="XR_OCULUS_android_session_state_enable_SPEC_VERSION"/>
+            <enum value="&quot;XR_OCULUS_android_session_state_enable&quot;" name="XR_OCULUS_ANDROID_SESSION_STATE_ENABLE_EXTENSION_NAME"/>
+        </require>
+    </extension>
+
 </extensions>
 </registry>
@@ -30,6 +30,7 @@ from reg import Registry
 import xrapi as api
 from xrconventions import OpenXRConventions as APIConventions

+
 def makeExtensionInclude(name):
     """Return an include command, given an extension name."""
     return 'include::{}/refpage.{}{}[]'.format(

@@ -37,6 +38,7 @@ def makeExtensionInclude(name):
         name,
         conventions.file_suffix)

+
 def isextension(name):
     """Return True if name is an API extension name (ends with an upper-case
     author ID).

@@ -44,18 +46,18 @@ def isextension(name):
     This assumes that author IDs are at least two characters."""
     return name[-2:].isalpha() and name[-2:].isupper()

+
 def printCopyrightSourceComments(fp):
     """Print Khronos CC-BY copyright notice on open file fp.

-    If comment is True, print as an asciidoc comment block, which copyrights the source
-    file. Otherwise print as an asciidoc include of the copyright in markup,
-    which copyrights the outputs. Also include some asciidoc boilerplate
-    needed by all the standalone ref pages."""
+    Writes an asciidoc comment block, which copyrights the source
+    file."""
     print('// Copyright (c) 2014-2019 Khronos Group. This work is licensed under a', file=fp)
     print('// Creative Commons Attribution 4.0 International License; see', file=fp)
     print('// http://creativecommons.org/licenses/by/4.0/', file=fp)
     print('', file=fp)

+
 def printFooter(fp):
     print('include::footer.txt[]', file=fp)
     print('', file=fp)
@@ -86,6 +88,7 @@ def macroPrefix(name):
         return 'No cross-references are available'
     return 'reflink:' + name

+
 def seeAlsoList(apiName, explicitRefs=None):
     """Return an asciidoc string with a list of 'See Also' references for the
     API entity 'apiName', based on the relationship mapping in the api module.

@@ -110,6 +113,7 @@ def seeAlsoList(apiName, explicitRefs = None):
         return None
     return ', '.join(macroPrefix(name) for name in sorted(refs.keys())) + '\n'

+
 def remapIncludes(lines, baseDir, specDir):
     """Remap include directives in a list of lines so they can be extracted to a
     different directory.

@@ -214,6 +218,8 @@ def refPageHead(pageName, pageDesc, specText, fieldName, fieldText, descText, fp):
     # specAnchor is None or the 'anchor' attribute from the refpage open block,
     # identifying the anchor in the specification this refpage links to. If
     # None, the pageName is assumed to be a valid anchor.


 def refPageTail(pageName,
                 specType=None,
                 specAnchor=None,
@@ -230,9 +236,8 @@ def refPageTail(pageName,
         seeAlso = 'No cross-references are available\n'

     notes = [
-        'For more information, see the ' + specName + ' Specification at URL',
-        '',
-        '{}#{}'.format(specURL, specAnchor),
+        'For more information, see the {}#{}[{} Specification^]'.format(
+            specURL, specAnchor, specName),
         '',
     ]

@@ -263,6 +268,7 @@ def refPageTail(pageName,

     printFooter(fp)

+
 def emitPage(baseDir, specDir, pi, file):
     """Extract a single reference page into baseDir.
@@ -323,7 +329,7 @@ def emitPage(baseDir, specDir, pi, file):

     # Substitute xrefs to point at the main spec
     specLinksPattern = re.compile(r'<<([^>,]+)[,]?[ \t\n]*([^>,]*)>>')
-    specLinksSubstitute = r'link:{}#\1[\2]'.format(specURL)
+    specLinksSubstitute = r'link:{}#\1[\2^]'.format(specURL)
     if specText is not None:
         specText, _ = specLinksPattern.subn(specLinksSubstitute, specText)
     if fieldText is not None:

@@ -346,6 +352,7 @@ def emitPage(baseDir, specDir, pi, file):
                   auto=False)
     fp.close()

+
 def autoGenEnumsPage(baseDir, pi, file):
     """Autogenerate a single reference page in baseDir.
@@ -454,6 +461,7 @@ def autoGenFlagsPage(baseDir, flagName):
                   auto=True)
     fp.close()

+
 def autoGenHandlePage(baseDir, handleName):
     """Autogenerate a single handle page in baseDir for an API handle type.

@@ -493,6 +501,7 @@ def autoGenHandlePage(baseDir, handleName):
                   auto=True)
     fp.close()

+
 def genRef(specFile, baseDir):
     """Extract reference pages from a spec asciidoc source file.

@@ -514,6 +523,7 @@ def genRef(specFile, baseDir):
     fixupRefs(pageMap, specFile, file)

     # Create each page, if possible
+    pages = {}

     for name in sorted(pageMap):
         pi = pageMap[name]
@@ -533,6 +543,13 @@ def genRef(specFile, baseDir):
             # Don't extract this page
             logWarn('genRef: Cannot extract or autogenerate:', pi.name)

+        pages[pi.name] = pi
+        for alias in pi.alias.split():
+            pages[alias] = pi
+
+    return pages
+
+
 def genSinglePageRef(baseDir):
     """Generate baseDir/apispec.txt, the single-page version of the ref pages.
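The bookkeeping added above indexes each refpage both under its own name and under every alias, all pointing at the same page record, so later lookups resolve either way. A minimal self-contained sketch of that pattern (the class shape and the `xrFoo` names are hypothetical):

```python
class PageInfo:
    def __init__(self, name, alias=''):
        self.name = name
        self.alias = alias  # space-separated alias names, '' if none

def collect_pages(page_infos):
    # Mirror the genRef bookkeeping: index each page by its own name
    # and by every alias, all mapping to the same PageInfo object.
    pages = {}
    for pi in page_infos:
        pages[pi.name] = pi
        for alias in pi.alias.split():
            pages[alias] = pi
    return pages

pages = collect_pages([PageInfo('xrFoo', alias='xrFooOld xrFooLegacy')])
```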
@@ -703,6 +720,12 @@ if __name__ == '__main__':
     parser.add_argument('-extension', action='append',
                         default=[],
                         help='Specify an extension or extensions to add to targets')
+    parser.add_argument('-rewrite', action='store',
+                        default=None,
+                        help='Name of output file to write Apache mod_rewrite directives to')
+    parser.add_argument('-toc', action='store',
+                        default=None,
+                        help='Name of output file to write an alphabetical TOC to')
     parser.add_argument('-registry', action='store',
                         default=conventions.registry_path,
                         help='Use specified registry file instead of default')
@@ -715,8 +738,12 @@ if __name__ == '__main__':

     baseDir = results.baseDir

+    # Dictionary of pages & aliases
+    pages = {}
+
     for file in results.files:
-        genRef(file, baseDir)
+        d = genRef(file, baseDir)
+        pages.update(d)

     # Now figure out which pages *weren't* generated from the spec.
     # This relies on the dictionaries of API constructs in the api module.
@@ -785,3 +812,57 @@ if __name__ == '__main__':
             logWarn('No ref page generated for ', title, page)

     genSinglePageRef(baseDir)
+
+    if results.rewrite:
+        # Generate Apache rewrite directives for refpage aliases
+        fp = open(results.rewrite, 'w', encoding='utf-8')
+
+        for page in sorted(pages):
+            p = pages[page]
+            rewrite = p.name
+
+            if page != rewrite:
+                print('RewriteRule ^', page, '.html$ ', rewrite, '.html',
+                      sep='', file=fp)
+        fp.close()
+
+    if results.toc:
+        # Generate dynamic portion of refpage TOC
+        fp = open(results.toc, 'w', encoding='utf-8')
+
+        # Run through dictionary of pages generating an TOC
+        print(12 * ' ', '<li class="Level1">Alphabetic Contents', sep='', file=fp)
+        print(16 * ' ', '<ul class="Level2">', sep='', file=fp)
+        lastLetter = None
+
+        for page in sorted(pages, key=str.upper):
+            p = pages[page]
+            letter = page[0:1].upper()
+
+            if letter != lastLetter:
+                if lastLetter:
+                    # End previous block
+                    print(24 * ' ', '</ul>', sep='', file=fp)
+                    print(20 * ' ', '</li>', sep='', file=fp)
+                # Start new block
+                print(20 * ' ', '<li>', letter, sep='', file=fp)
+                print(24 * ' ', '<ul class="Level3">', sep='', file=fp)
+                lastLetter = letter
+
+            # Add this page to the list
+            print(28 * ' ', '<li><a href="', p.name, '.html" ',
+                  'target="pagedisplay">', page, '</a></li>',
+                  sep='', file=fp)
+
+        if lastLetter:
+            # Close the final letter block
+            print(24 * ' ', '</ul>', sep='', file=fp)
+            print(20 * ' ', '</li>', sep='', file=fp)
+
+        # Close the list
+        print(16 * ' ', '</ul>', sep='', file=fp)
+        print(12 * ' ', '</li>', sep='', file=fp)
+
+        # print('name {} -> page {}'.format(page, pages[page].name))
+
+        fp.close()
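The TOC loop added above groups page names by first letter while walking a case-insensitively sorted list, opening a sub-list whenever the letter changes and closing the previous one. A condensed sketch of that grouping logic, writing to a string instead of a file and with the HTML simplified (the page names in the usage line are hypothetical):

```python
import io

def alpha_toc(pages):
    # Group sorted page names by first letter, emitting one nested
    # list per letter, the same shape as the refpage TOC loop above.
    fp = io.StringIO()
    last_letter = None
    for page in sorted(pages, key=str.upper):
        letter = page[0:1].upper()
        if letter != last_letter:
            if last_letter:
                # End the previous letter block
                print('</ul>', file=fp)
            # Start a new letter block
            print('<li>', letter, sep='', file=fp)
            print('<ul>', file=fp)
            last_letter = letter
        print('<li>', page, '</li>', sep='', file=fp)
    if last_letter:
        # Close the final letter block
        print('</ul>', file=fp)
    return fp.getvalue()
```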
@@ -22,7 +22,10 @@ import os
 import pdb
 import re
 import sys
+try:
+    from pathlib import Path
+except ImportError:
+    from pathlib2 import Path

 from spec_tools.util import getElemName, getElemType
@@ -671,7 +674,8 @@ class OutputGenerator:
         - param - Element (`<param>` or `<member>`) to format
         - aligncol - if non-zero, attempt to align the nested `<name>` element
           at this column"""
-        paramdecl = '    ' + noneStr(param.text)
+        indent = '    '
+        paramdecl = indent + noneStr(param.text)
         for elem in param:
             text = noneStr(elem.text)
             tail = noneStr(elem.tail)

@@ -691,6 +695,9 @@ class OutputGenerator:
                 newLen = len(paramdecl)
                 self.logMsg('diag', 'Adjust length of parameter decl from', oldLen, 'to', newLen, ':', paramdecl)
             paramdecl += text + tail
+        if aligncol == 0:
+            # Squeeze out multiple spaces other than the indentation
+            paramdecl = indent + ' '.join(paramdecl.split())
         return paramdecl

     def getCParamTypeLength(self, param):
@@ -866,6 +873,12 @@ class OutputGenerator:
             else:
                 pdecl += text + tail
                 tdecl += text + tail
+
+        if self.genOpts.alignFuncParam == 0:
+            # Squeeze out multiple spaces - there is no indentation
+            pdecl = ' '.join(pdecl.split())
+            tdecl = ' '.join(tdecl.split())
+
         # Now add the parameter declaration list, which is identical
         # for prototypes and typedefs. Concatenate all the text from
         # a <param> node without the tags. No tree walking required
@@ -16,6 +16,7 @@

 import argparse
 import os
 import pdb
 import re
 import sys
 import time

@@ -45,6 +46,7 @@ def startTimer(timeit):
     if timeit:
         startTime = time.process_time()

+
 def endTimer(timeit, msg):
     global startTime
     if timeit:
@@ -52,12 +54,16 @@ def endTimer(timeit, msg):
         write(msg, endTime - startTime, file=sys.stderr)
         startTime = None

-def makeREstring(strings, default=None):
+
+def makeREstring(strings, default=None, strings_are_regex=False):
     """Turn a list of strings into a regexp string matching exactly those strings."""
     if strings or default is None:
-        return '^(' + '|'.join((re.escape(s) for s in strings)) + ')$'
+        if not strings_are_regex:
+            strings = (re.escape(s) for s in strings)
+        return '^(' + '|'.join(strings) + ')$'
     return default


 def makeGenOpts(args):
     """Returns a directory of [ generator function, generator options ] indexed
     by specified short names. The generator options incorporate the following
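The new `strings_are_regex` flag lets callers pass patterns through without `re.escape`, while the default behavior still matches the listed strings literally. Reproducing the patched helper stand-alone shows the difference:

```python
import re

def makeREstring(strings, default=None, strings_are_regex=False):
    # As in the patch: build a regexp matching exactly these strings,
    # escaping them unless the caller says they are already regexes.
    if strings or default is None:
        if not strings_are_regex:
            strings = (re.escape(s) for s in strings)
        return '^(' + '|'.join(strings) + ')$'
    return default
```

With escaping, `'a.b'` only matches the literal string; with `strings_are_regex=True`, the `.` stays a wildcard.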
@@ -449,6 +455,9 @@ if __name__ == '__main__':
     tree = etree.parse(args.registry)
     endTimer(args.time, '* Time to make ElementTree =')

     if args.debug:
         pdb.run('reg.loadElementTree(tree)')
     else:
         startTimer(args.time)
         reg.loadElementTree(tree)
         endTimer(args.time, '* Time to parse ElementTree =')

@@ -472,7 +481,6 @@ if __name__ == '__main__':
         diag = None

     if args.debug:
-        import pdb
         pdb.run('genTarget(args)')
     elif args.profile:
         import cProfile
@@ -13,22 +13,21 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+"""Generate a mapping of extension name -> all required extension names for that extension.

-# make_ext_dependency - generate a mapping of extension name -> all required
-# extension names for that extension.
-#
-# This script generates a list of all extensions, and of just KHR
-# extensions, that are placed into a Bash script and/or Python script. This
-# script can then be sources or executed to set a variable (e.g. khrExts),
-# Frontend scripts such as 'makeAllExts' and 'makeKHR' use this information
-# to set the EXTENSIONS Makefile variable when building the spec.
-#
-# Sample Usage:
-#
-#   python3 scripts/make_ext_dependency.py -outscript=temp.sh
-#   source temp.sh
-#   make EXTENSIONS="$khrExts" html
-#   rm temp.sh
+This script generates a list of all extensions, and of just KHR
+extensions, that are placed into a Bash script and/or Python script. This
+script can then be sources or executed to set a variable (e.g. khrExts),
+Frontend scripts such as 'makeAllExts' and 'makeKHR' use this information
+to set the EXTENSIONS Makefile variable when building the spec.
+
+Sample Usage:
+
+    python3 scripts/make_ext_dependency.py -outscript=temp.sh
+    source temp.sh
+    make EXTENSIONS="$khrExts" html
+    rm temp.sh
+"""

 import argparse
 import errno
@@ -41,6 +40,7 @@ from xrconventions import OpenXRConventions as APIConventions
 def enQuote(key):
     return "'" + str(key) + "'"

+
 def shList(names):
     """Return a sortable (list or set) of names as a string encoding
     of a Bash or Python list, sorted on the names."""

@@ -49,12 +49,14 @@ def shList(names):
          '"')
     return s

+
 def pyList(names):
     s = ('[ ' +
          ', '.join(enQuote(key) for key in sorted(names)) +
          ' ]')
     return s

+
 class DiGraph:
     """A directed graph.
@@ -121,12 +123,14 @@ class DiGraph:
                     seen.add(y)
                     visit_me.append(y)

+
 class DiGraphNode:

     def __init__(self):
         # Set of adjacent of nodes.
         self.adj = set()

+
 def make_dir(fn):
     outdir = Path(fn).parent
     try:

@@ -135,6 +139,7 @@ def make_dir(fn):
         if e.errno != errno.EEXIST:
             raise

+
 # API conventions object
 conventions = APIConventions()
@@ -198,7 +203,7 @@ if __name__ == '__main__':
         fp = open(args.outscript, 'w', encoding='utf-8')

         print('#!/bin/bash', file=fp)
-        print('# Generated from extDependency.py', file=fp)
+        print('# Generated from make_ext_dependency.py', file=fp)
         print('# Specify maps of all extensions required by an enabled extension', file=fp)
         print('', file=fp)
         print('declare -A extensions', file=fp)

@@ -225,7 +230,7 @@ if __name__ == '__main__':
         fp = open(args.outpy, 'w', encoding='utf-8')

         print('#!/usr/bin/env python', file=fp)
-        print('# Generated from extDependency.py', file=fp)
+        print('# Generated from make_ext_dependency.py', file=fp)
         print('# Specify maps of all extensions required by an enabled extension', file=fp)
         print('', file=fp)
         print('extensions = {}', file=fp)
@@ -238,7 +243,8 @@ if __name__ == '__main__':

         # Only emit an ifdef block if an extension has dependencies
         if children:
-            print("extensions['" + ext + "'] = " + pyList(children), file=fp)
+            print("extensions['" + ext + "'] = " +
+                  pyList(children), file=fp)

         print('', file=fp)
         print('# Define lists of all extensions and KHR extensions', file=fp)
@@ -154,6 +154,9 @@ class pageInfo:
         self.end = None
         """index of last line of the page (heuristic validity include, or // refEnd)"""

+        self.alias = ''
+        """aliases of this name, if supplied, or ''"""
+
         self.refs = ''
         """cross-references on // refEnd line, if supplied"""

@@ -473,6 +476,7 @@ def findRefs(file, filename):
             refpage_type = None
             spec_type = None
             anchor = None
+            alias = None
             xrefs = None

             for (key,value) in matches:

@@ -487,6 +491,8 @@ def findRefs(file, filename):
                     spec_type = value
                 elif key == 'anchor':
                     anchor = value
+                elif key == 'alias':
+                    alias = value
                 elif key == 'xrefs':
                     xrefs = value
                 else:
@@ -503,11 +509,14 @@ def findRefs(file, filename):
             pi.type = refpage_type
             pi.spec = spec_type
             pi.anchor = anchor
+            if alias:
+                pi.alias = alias
             if xrefs:
                 pi.refs = xrefs
-            logDiag('open block for', name, 'added DESC =', desc,
-                    'TYPE =', refpage_type, 'XREFS =', xrefs,
-                    'SPEC =', spec_type, 'ANCHOR =', anchor)
+            logDiag('open block for', name, 'added DESC =', desc,
+                    'TYPE =', refpage_type, 'ALIAS =', alias,
+                    'XREFS =', xrefs, 'SPEC =', spec_type,
+                    'ANCHOR =', anchor)

             line = line + 1
             continue
@@ -591,7 +591,7 @@ class Registry:
         """Dump all the dictionaries constructed from the Registry object.

         Diagnostic to dump the dictionaries to specified file handle (default stdout).
-        Truncates type / enum / command elements to maxlen characters (default 80)"""
+        Truncates type / enum / command elements to maxlen characters (default 120)"""
         write('***************************************', file=filehandle)
         write(' ** Dumping Registry contents **', file=filehandle)
         write('***************************************', file=filehandle)
@@ -29,7 +29,7 @@ class RecursiveMemoize:

     """

-    def __init__(self, key_iterable=None, permit_cycles=False):
+    def __init__(self, func, key_iterable=None, permit_cycles=False):
         """Initialize data structures, and optionally compute/cache the answer
         for all elements of an iterable.

@@ -38,6 +38,7 @@ class RecursiveMemoize:
         If permit_cycles is True, then __getitem__ on something that's
         currently being computed returns None.
         """
+        self._compute = func
         self.permit_cycles = permit_cycles
         self.d = {}
         if key_iterable:

@@ -64,8 +65,8 @@ class RecursiveMemoize:

         # Set sentinel for "we're computing this"
         self.d[key] = None
-        # Delegate to subclass to actually compute
-        ret = self.compute(key)
+        # Delegate to function to actually compute
+        ret = self._compute(key)
         # Memoize
         self.d[key] = ret
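This refactor changes RecursiveMemoize from requiring a subclass that overrides a `compute` method to taking the compute function as a constructor argument, which is what lets the later diffs define `compute` as a closure inside `__init__`. A condensed, self-contained sketch of the refactored shape (the Fibonacci usage is hypothetical, just to show the self-referential lookup):

```python
class RecursiveMemoize:
    # Condensed sketch of the refactored class: the compute function is
    # passed in, may recurse through self[key], and results are memoized.
    def __init__(self, func, key_iterable=None, permit_cycles=False):
        self._compute = func
        self.permit_cycles = permit_cycles
        self.d = {}
        if key_iterable:
            # Optionally pre-compute the answer for every listed key
            for key in key_iterable:
                _ = self[key]

    def __getitem__(self, key):
        if key in self.d:
            ret = self.d[key]
            if ret is None and not self.permit_cycles:
                raise RuntimeError('cycle detected for key %r' % (key,))
            return ret
        self.d[key] = None          # sentinel: "currently computing"
        ret = self._compute(key)    # delegate to the supplied function
        self.d[key] = ret           # memoize
        return ret

# Hypothetical usage: memoized Fibonacci via self-reference.
fib = RecursiveMemoize(lambda n: n if n < 2 else fib[n - 1] + fib[n - 2])
```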
@@ -19,10 +19,12 @@

 import re

+import networkx as nx
+
 from .algo import RecursiveMemoize
 from .attributes import ExternSyncEntry, LengthEntry
-from .util import findNamedElem, getElemName
 from .data_structures import DictOfStringSets
+from .util import findNamedElem, getElemName

 class XMLChecker:
@@ -104,8 +106,8 @@ class XMLChecker:

         unrecognized = specified_codes - self.return_codes
         if unrecognized:
-            raise RuntimeError("Return code mentioned in script that isn't in the registry: "
-                               + ', '.join(unrecognized))
+            raise RuntimeError("Return code mentioned in script that isn't in the registry: " +
+                               ', '.join(unrecognized))

         self.referenced_input_types = ReferencedTypes(self.db, self.is_input)
         self.referenced_api_types = ReferencedTypes(self.db, self.is_api_type)
@@ -269,7 +271,20 @@ class XMLChecker:
             if not name.startswith(self.conventions.type_prefix):
                 self.record_error("Name does not start with",
                                   self.conventions.type_prefix)
-            self.check_params(info.elem.findall('member'))
+            members = info.elem.findall('member')
+            self.check_params(members)
+
+            # Check the structure type member, if present.
+            type_member = findNamedElem(
+                members, self.conventions.structtype_member_name)
+            if type_member is not None:
+                val = type_member.get('values')
+                if val:
+                    expected = self.conventions.generate_structure_type_from_name(
+                        name)
+                    if val != expected:
+                        self.record_error("Type has incorrect type-member value: expected",
+                                          expected, "got", val)

         elif category == "bitmask":
             if 'Flags' not in name:
@@ -369,10 +384,15 @@ class XMLChecker:
                         name, referenced_type)
                     missing_codes = required_codes - codes
                     if missing_codes:
+                        path = self.referenced_input_types.shortest_path(
+                            name, referenced_type)
+                        path_str = " -> ".join(path)
                         self.record_error("Missing expected return code(s)",
                                           ",".join(missing_codes),
                                           "implied because of input of type",
-                                          referenced_type)
+                                          referenced_type,
+                                          "found via path",
+                                          path_str)

         # Check that, for each code returned by this command that we can
         # associate with a type, we have some type that can provide it.
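The improved diagnostic above reports not only which return code is missing but the chain of type references that implies it. The script delegates that to networkx's `shortest_path` over the reference graph; a plain-Python BFS sketch of the same idea on a dict-of-sets adjacency map (the graph contents here are hypothetical):

```python
from collections import deque

def shortest_path(adj, source, target):
    # Breadth-first search: the same job networkx shortest_path does for
    # the "found via path" diagnostic, over a dict-of-sets adjacency map.
    parents = {source: None}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == target:
            # Walk parents back to the source to recover the path
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        for nxt in adj.get(node, ()):
            if nxt not in parents:
                parents[nxt] = node
                queue.append(nxt)
    return None  # target unreachable

# Hypothetical reference graph: a command's param references a struct,
# which references a handle type that implies a return code.
adj = {'xrCmd': {'XrCreateInfo'}, 'XrCreateInfo': {'XrSession'}}
```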
@@ -409,7 +429,7 @@ class XMLChecker:
         """Record failure and an error message for the current context."""
         message = " ".join((str(x) for x in args))

-        if self.entity_suppressions and message in self.entity_suppressions:
+        if self._is_message_suppressed(message):
             return

         message = self._prepend_sourceline_to_message(message, **kwargs)

@@ -420,12 +440,22 @@ class XMLChecker:
         """Record a warning message for the current context."""
         message = " ".join((str(x) for x in args))

-        if self.entity_suppressions and message in self.entity_suppressions:
+        if self._is_message_suppressed(message):
             return

         message = self._prepend_sourceline_to_message(message, **kwargs)
         self.warnings.add(self.entity, message)

+    def _is_message_suppressed(self, message):
+        """Return True if the given message, for this entity, should be suppressed."""
+        if not self.entity_suppressions:
+            return False
+        for suppress in self.entity_suppressions:
+            if suppress in message:
+                return True
+
+        return False
+
     def _prepend_sourceline_to_message(self, message, **kwargs):
         """Prepend a file and/or line reference to the message, if possible.
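Note the semantic change the new helper introduces: the old checks used `message in self.entity_suppressions`, an exact-match membership test, while `_is_message_suppressed` uses `suppress in message`, so a suppression entry now matches as a substring anywhere in the message. A stand-alone sketch of the new behavior (messages and suppression entries are hypothetical):

```python
def is_message_suppressed(message, suppressions):
    # Substring semantics from the patch: a suppression entry matches if
    # it occurs anywhere in the message, not only on exact equality.
    if not suppressions:
        return False
    for suppress in suppressions:
        if suppress in message:
            return True
    return False
```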
@@ -464,10 +494,9 @@ class HandleParents(RecursiveMemoize):
     def __init__(self, handle_types):
         self.handle_types = handle_types

-        super().__init__(handle_types.keys())
-
-    def compute(self, handle_type):
-        immediate_parent = self.handle_types[handle_type].elem.get('parent')
+        def compute(handle_type):
+            immediate_parent = self.handle_types[handle_type].elem.get(
+                'parent')

             if immediate_parent is None:
                 # No parents, no need to recurse

@@ -482,6 +511,8 @@ class HandleParents(RecursiveMemoize):
                 all_parents.extend(self[parent])
             return all_parents

+        super().__init__(compute, handle_types.keys())
+

 def _always_true(x):
     return True
@@ -501,15 +532,18 @@ class ReferencedTypes(RecursiveMemoize):
         if not self.predicate:
             # Default predicate is "anything goes"
             self.predicate = _always_true
-        super().__init__(permit_cycles=True)
-
-    def compute(self, type_name):
-        members = self.db.getMemberElems(type_name)
-        if not members:
-            return set()
-        types = ((member, member.find("type")) for member in members)
-        types = set(type_elem.text for (member, type_elem) in types
-                    if type_elem is not None and self.predicate(member))
+
+        self._directly_referenced = {}
+        self.graph = nx.DiGraph()
+
+        def compute(type_name):
+            """Compute and return all types referenced by type_name, recursively, that satisfy the predicate.
+
+            Called by the [] operator in the base class."""
+            types = self.directly_referenced(type_name)
+            if not types:
+                return types

             all_types = set()
             all_types.update(types)
             for t in types:

@@ -519,6 +553,37 @@ class ReferencedTypes(RecursiveMemoize):
                 all_types.update(referenced)
             return all_types

+        # Initialize base class
+        super().__init__(compute, permit_cycles=True)
+
+    def shortest_path(self, source, target):
+        """Get the shortest path between one type/function name and another."""
+        # Trigger computation
+        _ = self[source]
+
+        return nx.algorithms.shortest_path(self.graph, source=source, target=target)
+
+    def directly_referenced(self, type_name):
+        """Get all types referenced directly by type_name that satisfy the predicate.
+
+        Memoizes its results."""
+        if type_name not in self._directly_referenced:
+            members = self.db.getMemberElems(type_name)
+            if members:
+                types = ((member, member.find("type")) for member in members)
+                self._directly_referenced[type_name] = set(
+                    type_elem.text for (member, type_elem) in types
+                    if type_elem is not None and self.predicate(member))
+            else:
+                self._directly_referenced[type_name] = set()
+
+            # Update graph
+            self.graph.add_node(type_name)
+            self.graph.add_edges_from((type_name, t)
+                                      for t in self._directly_referenced[type_name])
+
+        return self._directly_referenced[type_name]
+
+
 class HandleData:
     """Data about all the handle types available in an API specification."""
@@ -32,28 +32,6 @@ def getElemType(elem, default=None):
     # Fallback if there is no child.
     return elem.get('type', default)
 
-def _conditional_string_strip_append(my_list, optional_str):
-    if not optional_str:
-        return
-    stripped = optional_str.strip()
-    if not stripped:
-        return
-    my_list.append(stripped)
-
-def getParamOrMemberFullType(elem, default=None):
-    """Get the full type associated with a member or param element.
-
-    This includes the text preceding, within, and following the 'type' tag."""
-    parts = []
-    for node in elem.iter():
-        _conditional_string_strip_append(parts, node.text)
-        _conditional_string_strip_append(parts, node.tail)
-        if node.tag == 'type':
-            return ' '.join(parts)
-
-    # Fallback if there is no child with a "type" tag
-    return default
-
 
 def findFirstWithPredicate(collection, pred):
     """Return the first element that satisfies the predicate, or None if none exist.
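To make the behavior of the `getParamOrMemberFullType` helper shown above concrete, here is a self-contained replica of its text/tail walk over a registry-style `<param>` element (the element content is invented for illustration):

```python
import xml.etree.ElementTree as ET

def conditional_strip_append(parts, optional_str):
    # Append optional_str to parts only if it is non-empty after stripping
    if optional_str and optional_str.strip():
        parts.append(optional_str.strip())

def param_full_type(elem, default=None):
    """Collect the text before, inside, and after the 'type' child."""
    parts = []
    for node in elem.iter():
        conditional_strip_append(parts, node.text)
        conditional_strip_append(parts, node.tail)
        if node.tag == 'type':
            return ' '.join(parts)
    return default  # no 'type' child found

param = ET.fromstring('<param>const <type>XrInstanceCreateInfo</type>* createInfo</param>')
print(param_full_type(param))  # const XrInstanceCreateInfo * createInfo
```

The walk visits the `param` element itself first (picking up the leading `const`), then the `type` child, whose tail text supplies the trailing `* createInfo`.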
@@ -364,8 +364,8 @@ class ValidityOutputGenerator(OutputGenerator):
 
         # Host Synchronization
         if threadsafety:
-            # TODO the heading of this block differs between projects
-            write('.Thread Safety', file=fp)
+            # The heading of this block differs between projects, so an Asciidoc attribute is used.
+            write('.{externsynctitle}', file=fp)
             write('****', file=fp)
             write(threadsafety, file=fp, end='')
             write('****', file=fp)
@@ -770,8 +770,11 @@ class ValidityOutputGenerator(OutputGenerator):
 
         elif typecategory == 'bitmask':
             bitsname = paramtype.replace('Flags', 'FlagBits')
-            if self.registry.tree.find("enums[@name='" + bitsname + "']") is None:
-                # Empty bit mask: presumably just a placeholder (or only in an extension not enabled for this build)
+            bitselem = self.registry.tree.find("enums[@name='" + bitsname + "']")
+            if bitselem is None or len(bitselem.findall('enum')) == 0:
+                # Empty bit mask: presumably just a placeholder (or only in
+                # an extension not enabled for this build)
+
                 entry = ValidityEntry(
                     anchor=(param_name, 'zerobitmask'))
                 entry += self.makeParameterName(param_name)
@@ -1212,8 +1215,8 @@ class ValidityOutputGenerator(OutputGenerator):
         # See also makeThreadSafetyBlock in validitygenerator.py
         validity = self.makeValidityCollection(getElemName(cmd))
 
-        # This text varies between projects
-        extsync_prefix = "Access to "
+        # This text varies between projects, so an Asciidoctor attribute is used.
+        extsync_prefix = "{externsyncprefix} "
 
         # Find and add any parameters that are thread unsafe
         explicitexternsyncparams = cmd.findall(paramtext + "[@externsync]")
@@ -1262,6 +1265,10 @@ class ValidityOutputGenerator(OutputGenerator):
         return validity
 
     def findRequiredEnums(self, enums):
+        """Check each enumerant name in the enums list and remove it if not
+        required by the generator. This allows specifying success and error
+        codes for extensions that are only included in validity when needed
+        for the spec being targeted."""
        return self.keepOnlyRequired(enums, self.registry.enumdict)
 
     def findRequiredCommands(self, commands):
@@ -235,7 +235,7 @@ class Checker(XMLChecker):
         param_type = getElemType(param_elem)
         if param_type != 'uint32_t':
             self.record_error('Two-call-idiom call has count parameter', param_name,
-                              'with type', input_type, 'instead of uint32_t')
+                              'with type', param_type, 'instead of uint32_t')
         type_elem = param_elem.find('type')
         assert(type_elem is not None)
 
@@ -295,7 +295,6 @@ class Checker(XMLChecker):
     def check_two_call_command(self, name, info, params):
         """Check a two-call-idiom command."""
-        named_params = [(getElemName(p), p) for p in params]
         param_indices = {getElemName(p): i for i, p in enumerate(params)}
 
         # Find the three important parameters
         capacity_input_param_name = None
@@ -359,6 +358,14 @@ class Checker(XMLChecker):
         """Check a command's XML data for consistency.
 
         Called from check."""
+        t = info.elem.find('proto/type')
+        if t is None:
+            self.record_warning("Got a command without a return type?")
+        else:
+            return_type = t.text
+            if return_type != 'XrResult':
+                self.record_error("Got a function that returned", return_type,
+                                  "instead of XrResult - some scripts/software assume all functions return XrResult!")
         params = info.elem.findall('./param')
         for p in params:
             param_name = getElemName(p)
@@ -409,6 +416,7 @@ class Checker(XMLChecker):
         else:
             fn = get_extension_source(name)
+            revisions = []
             try:
                 with open(fn, 'r', encoding='utf-8') as fp:
                     for line in fp:
                         line = line.rstrip()
@@ -430,6 +438,9 @@ class Checker(XMLChecker):
+                self.record_error("Cannot find version history in spec text, but XML reports a non-1 version number", ver_from_xml,
+                                  " - make sure the spec text has lines starting exactly like '* Revision 1, ....'",
+                                  filename=fn)
             except FileNotFoundError:
                 # This is OK: just means we can't check against the spec text.
                 pass
 
         name_define = "{}_EXTENSION_NAME".format(name_upper)
         name_elem = findNamedElem(enums, name_define)
@@ -27,7 +27,7 @@ from conventions import ConventionsBase
 # Note that while Vulkan is listed as a special case,
 # it doesn't actually break the rules, so it's not treated specially
 MAIN_RE = re.compile(
-    r'(?P<d3d>D3D[0-9]*)|(?P<gl>OpenGL(ES)?)|(?P<word>([A-Z][a-z]+)|([A-Z][A-Z0-9]+))')
+    r'(?P<d3d>D3D[0-9]*)|(?P<gl>OpenGL(ES)?)|(?P<word>([A-Z][a-z]+[a-z0-9]*)|([A-Z][A-Z0-9]+))')
 
 
 class OpenXRConventions(ConventionsBase):
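The effect of the regex change above can be checked directly: the widened `word` alternative `[A-Z][a-z]+[a-z0-9]*` keeps a trailing digit run attached to a mixed-case word, where the old pattern dropped it. The sample names here are illustrative:

```python
import re

# Old pattern: a mixed-case word is [A-Z][a-z]+ (digits after lowercase are lost)
OLD_RE = re.compile(
    r'(?P<d3d>D3D[0-9]*)|(?P<gl>OpenGL(ES)?)|(?P<word>([A-Z][a-z]+)|([A-Z][A-Z0-9]+))')
# New pattern: [A-Z][a-z]+[a-z0-9]* keeps a trailing digit run attached
NEW_RE = re.compile(
    r'(?P<d3d>D3D[0-9]*)|(?P<gl>OpenGL(ES)?)|(?P<word>([A-Z][a-z]+[a-z0-9]*)|([A-Z][A-Z0-9]+))')

def tokens(pattern, name):
    """Split a mixed-case API name into its matched chunks."""
    return [m.group(0) for m in pattern.finditer(name)]

print(tokens(OLD_RE, 'Win32KeyedMutex'))  # ['Win', 'Keyed', 'Mutex'] - the '32' is dropped
print(tokens(NEW_RE, 'Win32KeyedMutex'))  # ['Win32', 'Keyed', 'Mutex']
print(tokens(NEW_RE, 'D3D11Device'))      # ['D3D11', 'Device'] - D3D handling unchanged
```

The `d3d` and `gl` special cases still win first, so names like `D3D11` tokenize exactly as before.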
@@ -60,3 +60,8 @@ endif::backend-html5[]
 ifndef::backend-html5[]
 :wbro:
 endif::backend-html5[]
+
+
+// Placeholders for synchronization block text
+:externsynctitle: Thread Safety
+:externsyncprefix: Access to
@@ -1,308 +0,0 @@
[[extension-processes]]
== Extension Processes

To help understand how OpenXR extensions are created and supported, this
section of the spec will discuss the various processes for creating,
supporting, and retiring extensions.

=== Extension Types

As mentioned in the <<extensions, Extensions>> section, OpenXR extensions
can take the form of:

* Khronos Extensions: containing "KHR_" in their name
* Multi-vendor Extensions: identified by the "EXT_" in their name
* Vendor-specific Extensions
** Examples could include:
*** ARM - XR_ARM_...
*** Google - XR_GOOGLE_...
*** Oculus - Desktop: XR_OCULUS_...

[NOTE]
.Note
====
All vendor IDs are clearly identified in the OpenXR registry (xr.xml).
====

=== Typical Extension Process Flow

The typical extension process can be best described in the following way:

One company thinks of a new feature and creates their own vendor-specific
extension.
At some point, if they desire, the company reveals the extension to members
of the OpenXR Working Group (WG).

If multiple companies agree on the design, they can choose to create an
`EXT` extension that will be supported by multiple companies.
When creating the `EXT` extension, separate discussions must: occur outside
of the Khronos IP-Zone.

If a majority of companies within the working group desire to cooperate on
making the functionality more consistent across the OpenXR API, they can
discuss creating a new version of the extension as a Khronos extension with
the `KHR` prefix.

[NOTE]
.Note
====
The originating company could still release their original vendor-specific
extension if they desire to get it out in a timely manner.
If everyone likes the direction, they can create a KHR extension.
====

Sometimes, multiple companies may come up with differing ideas of how to
implement a given feature.
Often, this will result in multiple vendor-specific extensions.
It is preferable to cooperate and create either `EXT` or `KHR` extensions
whenever possible, since developers prefer to use common extensions.

It is also possible that the OpenXR working group could create a new
extension without any precedent.
When this occurs, the extension is released directly as a `KHR` extension.

A graphical view of the extension discussion flow can be seen in the
following picture:

image:images/extension_discussion_flow.png["Extension Discussion Flow",width=1024]

==== Creating Extensions During Khronos Discussions

As shown in the above image, Khronos extensions belong in the Khronos IP
zone and all other extension types exist outside of the Khronos IP zone.
Khronos extensions, therefore, may be freely discussed and designed at any
time within Khronos.
However, because all other extensions are outside of the Khronos IP zone,
detailed design discussions within Khronos of these extensions should: occur
outside of Khronos email lists and/or meeting times.
If a non-KHR extension's design is discussed during any of these, the
discussion must: be clearly segregated and preceded by an indication that
the following discussion is on an extension that will not be part of Khronos
IP.
It is preferable to avoid talking about those types of extensions within
Khronos whenever possible.

=== Extension Registration Process

Each extension has a unique number used to identify it within the OpenXR
API.
Since there can be multiple authors working on OpenXR extensions
simultaneously, the extension author must: first register the extension with
the OpenXR Working Group.
When registering an extension, the author only provides a minimal amount of
information about their extension to help avoid future spec conflicts.

The process is detailed in the OpenXR Styleguide under the "Registering
Extensions" section.
Please follow this process when creating extensions to avoid extension merge
collisions.

[WARNING]
.TODO (i/769)
====
Provide link to styleguide?
====

The rough process for registering an extension can be split into several
steps to accommodate extension number assignment prior to extension
publication:

1. Acquire an extension number.
   * This is done by proposing a merge request against xr.xml.
   * The merge should add a new <extension> tag at the end of the file with
     attributes specifying the proposed extension name, the next unused
     sequential extension number, the author and contact information (if
     different than that already specified for the author ID used in the
     extension name), and finally, disabling the extension using
     `supported="disabled"` instead of `supported="openxr"`.
   * The extension number will be reserved only once this merge request is
     accepted.
2. Develop and test the extension using the registered extension number.
3. Create a second merge request with the completed extension using the
   previously registered extension number, and submit it to the appropriate
   repository storing the OpenXR code.
   * This should include all necessary changes to the OpenXR specification,
     the OpenXR registry file (xr.xml), and any test or example source in
     the tree that is affected by the change.

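As a purely hypothetical illustration of step 1, a reservation-stage entry in xr.xml might look roughly like the following; the extension name, number, and vendor tag are invented, and the styleguide plus the registry schema remain authoritative for the exact attribute set:

```xml
<!-- Hypothetical reservation entry: name, number, and vendor are invented. -->
<extension name="XR_ACME_example_feature" number="42" supported="disabled"/>
```

The second merge request in step 3 would then flip `supported` to `"openxr"` and fill in the extension's actual requirements.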
=== Extension Approval Process

The extension approval process starts when a completed extension's merge
request has been properly submitted.
The approval process is important since no extension can be merged into the
OpenXR API Specification until it has been approved by the appropriate
members/companies.
The extension approval process does vary based upon what type of extension
is being submitted, and the differences are pointed out in the following
sections.


==== KHR Extension Approval Process

`KHR` extensions are a special case in the extension approval process
because they must be approved by a majority of the Working Group members.
A `KHR` extension must be developed in full view and with the participation
of the Khronos OpenXR Working Group (WG).
The development of the extension may: occur initially through the use of one
or more OpenXR mailing lists, but must: eventually be discussed during
either a Technical Sub-Group (TSG) conference call, the main Working Group
conference call, or at a Khronos Face-to-Face.
This required visibility is to provide sufficient time for members to
provide their own input as well as evaluate any potential Intellectual
Property (IP) concerns prior to an approval vote.

Typically, one member of the OpenXR WG, or one of the OpenXR TSGs, will
volunteer to `champion` the extension.
The `champion` is required to document the extension and all concerns as
well as create the final merge request to integrate the extension into the
appropriate branch.
Often, the appropriate branch will be the master branch, but this may vary.
This `champion` should work with all interested companies and address their
concerns about the extension when creating the final merge request.

After the appropriate members feel that the extension has progressed enough,
the `champion` must: submit the merge request to the Working Group for
approval.
The merge request must: conform to the requirements identified in the
sections above regarding all extension-related content, including the
specification section formatting and the contents of the registry (xr.xml).

Once the Working Group approves the merge request and the extension, it
must: be submitted to the appropriate group within Khronos for final review
and approval before it can be made available to the public.


[WARNING]
.TODO (i/769)
====
Is providing a Conformance test a requirement for `KHR` extension approval?
====

==== EXT Extension Approval Process

The process for accepting `EXT` extensions differs from that of
accepting `KHR` extensions.

Unlike the `KHR` extension approval process:

1. Only two or more companies need to be involved, and only one of them
   needs to be in the Working Group.
2. The IP is not considered to be Khronos IP, and so discussions during any
   Working Group or Technical Sub-Group time must: be preceded by the
   appropriate disclaimers.
3. Only the companies participating in the development of the `EXT`
   extension need to approve of it prior to creating a merge request.
   * This approval may be determined in whatever method the participating
     companies feel is appropriate.
4. The champion still must create the merge request against the appropriate
   branch.
   * In this case, only the participating companies are responsible for
     approving the merge request before review by the appropriate OpenXR
     Specification editor.
5. To indicate that the `EXT` extension has been approved by the appropriate
   members, the champion, or someone they designate, will submit the merge
   request to the Working Group for approval and indicate that it has been
   approved by all relevant members.
6. The Specification editor must: review the merge request and ensure that:
   * CI does not fail for any reason due to the changes within the merge
     request.
   * There are no pending conflicts.
   * The reviewer may: also perform a quick check of the correctness of the
     specification and registry (xr.xml) changes.
7. Once the editor is satisfied with their review, they (or someone
   approved by them to perform a merge) must: merge the merge request into
   the appropriate branch within a reasonable amount of time (typically
   within one or two weeks).

==== Vendor-Specific Extension Approval Process

Similar to an `EXT` extension, vendor-specific extensions should: not use
OpenXR WG or TSG time and may: be done entirely within the domain of the
company creating the extension.

For vendor-specific extensions, the main process is:

1. The company creating the extension identifies a `champion` to write up
   the extension merge request.
2. The company determines when they are ready, then submits a merge request
   with the completed extension's changes to the OpenXR Specification
   editor.
3. The Specification editor must: review the merge request and ensure that:
   * CI does not fail for any reason due to the changes within the merge
     request.
   * There are no pending conflicts.
   * The reviewer may: also perform a quick check of the correctness of the
     specification and registry (xr.xml) changes.
4. Once the editor is satisfied with their review, they (or someone
   approved by them to perform a merge) must: merge the merge request into
   the appropriate branch within a reasonable amount of time (typically
   within one or two weeks).

[NOTE]
.Note
====
If an extension is not intended for private use on a company's particular
hardware, runtime, or environment, it is recommended that they disclose the
extension to the WG.
This may occur during the step of merge request creation, but may occur at
any point the vendor desires.

This suggestion is based on the fact that other companies may be willing to
collaborate on the design of an `EXT` for common behavior.
`EXT` extensions are preferable for application developers since they are
guaranteed to work across more than one vendor.

However, sometimes a vendor may desire speed over collaboration and should:
not feel pressured into always using the `EXT` path for extensions.
====

=== Extension Deprecation Process

Extensions can be deprecated for several reasons:

1. The extension is no longer useful or supported.
2. The extension has been replaced by another extension.
3. The extension functionality has been merged into the core OpenXR API.

In cases 1 and 2, runtime vendors may simply stop exporting support for a
deprecated extension at some point in the future.
This is okay since extensions are optional.
It is recommended that a transition period occur with some kind of warning
indicating that the extension is going away, since applications could be
written that depend on it.

However, because released applications could depend on an extension, it is
preferable to support that extension at least until a new version of the
OpenXR API is released.
For example, if an extension is deprecated and it is written as part of
OpenXR 1.0, it is recommended that runtimes remove support for that
extension no sooner than the release of OpenXR 1.1.

Deprecated extensions will continue to be listed in the OpenXR API, with
some indication that they have been deprecated, until at least the next
major version bump of the OpenXR API.
This is also the case for extension features that have been merged into the
OpenXR core API (case 3 above).
In fact, it is recommended that on a major API version increase, all
available extensions be re-evaluated to determine if they should be removed
from the API.

In comparison, Vulkan is currently considering that no extensions will
survive the bump from one major API version to another (i.e., 1.x to 2.x).
In this way, they can completely clean the slate of extensions.
OpenXR may or may not want to do the same.

[WARNING]
.TODO (i/769)
====
Decide if OpenXR wants to do the same thing as Vulkan for extensions.
====
@@ -0,0 +1,110 @@
include::../meta/XR_OCULUS_android_session_state_enable.adoc[]

*Overview*

This extension enables the integration of the Android session lifecycle and
an OpenXR runtime session state.
Some OpenXR runtimes may require this extension to transition the
application to the session READY or STOPPING state.

Applications that run on an Android system with this extension enabled have
a different OpenXR session state flow.

On Android, it is the Android Activity lifecycle that will dictate when the
system is ready for the application to begin or end its session, not the
runtime.

When XR_OCULUS_android_session_state is enabled, the following changes are
made to session state handling:

- The runtime does not determine when the application's session should be
  moved to the ready state, ename:XR_SESSION_STATE_READY.
  The application should not wait to receive the ename:XR_SESSION_STATE_READY
  session state changed event before beginning a session.
  Instead, the application should begin its session once there is a surface
  and the activity is resumed.

- The application should not call flink:xrRequestExitSession to request that
  the session move to the stopping state, ename:XR_SESSION_STATE_STOPPING.
  flink:xrRequestExitSession will return ename:XR_ERROR_VALIDATION_FAILURE if
  called.

- The application should not wait to receive the
  ename:XR_SESSION_STATE_STOPPING session state changed event before ending a
  session.
  Instead, the application should end its session once the surface is
  destroyed or the activity is paused.

- The runtime will not transition to ename:XR_SESSION_STATE_READY or
  ename:XR_SESSION_STATE_STOPPING, as the state is implicit from the Android
  activity and surface lifecycles.

*Android Activity life cycle*

An Android Activity can only be in the session running state while the
activity is in the resumed state.
The following shows how beginning and ending an XR session fits into the
Android Activity life cycle.

[source]
----
1. VrActivity::onCreate() <---------+
2. VrActivity::onStart() <-------+  |
3. VrActivity::onResume() <---+  |  |
4. xrBeginSession()           |  |  |
5. xrEndSession()             |  |  |
6. VrActivity::onPause() -----+  |  |
7. VrActivity::onStop() ---------+  |
8. VrActivity::onDestroy() ---------+
----

*Android Surface life cycle*

An Android Activity can only be in the session running state while there is
a valid Android Surface.
The following shows how beginning and ending an XR session fits into the
Android Surface life cycle.

[source]
----
1. VrActivity::surfaceCreated() <----+
2. VrActivity::surfaceChanged()      |
3. xrBeginSession()                  |
4. xrEndSession()                    |
5. VrActivity::surfaceDestroyed() ---+
----

Note that the life cycle of a surface is not necessarily tightly coupled
with the life cycle of an activity.
These two life cycles may interleave in complex ways.
Usually surfaceCreated() is called after onResume() and surfaceDestroyed()
is called between onPause() and onDestroy().
However, this is not guaranteed and, for instance, surfaceDestroyed() may be
called after onDestroy() or even before onPause().

An Android Activity is only in the resumed state with a valid Android
Surface between surfaceChanged() or onResume(), whichever comes last, and
surfaceDestroyed() or onPause(), whichever comes first.
In other words, an XR application will typically begin the session from
surfaceChanged() or onResume(), whichever comes last, and end the session
from surfaceDestroyed() or onPause(), whichever comes first.

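The "whichever comes last / whichever comes first" rule above is easy to get wrong. This hypothetical state tracker (written in Python for brevity; it is not part of the extension or of any real OpenXR binding) shows the intended logic: begin when both conditions hold, end as soon as either is lost:

```python
class XrSessionGate:
    """Hypothetical helper: begin the session once the activity is resumed
    AND a surface exists; end it as soon as either condition is lost."""

    def __init__(self, begin_session, end_session):
        self._begin = begin_session   # would wrap a call to xrBeginSession
        self._end = end_session       # would wrap a call to xrEndSession
        self.resumed = False
        self.has_surface = False
        self.running = False

    def _update(self):
        should_run = self.resumed and self.has_surface
        if should_run and not self.running:
            self._begin()
            self.running = True
        elif not should_run and self.running:
            self._end()
            self.running = False

    # The Android lifecycle callbacks would forward into these four methods:
    def on_resume(self):
        self.resumed = True
        self._update()

    def on_pause(self):
        self.resumed = False
        self._update()

    def surface_changed(self):
        self.has_surface = True
        self._update()

    def surface_destroyed(self):
        self.has_surface = False
        self._update()

events = []
gate = XrSessionGate(lambda: events.append('begin'), lambda: events.append('end'))
gate.on_resume()        # no surface yet: session not begun
gate.surface_changed()  # both conditions now hold -> begin
gate.on_pause()         # resumed lost -> end
print(events)           # ['begin', 'end']
```

Tracking the two conditions independently and reacting to every change handles the complex interleavings described above without caring which callback arrives first.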
*New Object Types*

*New Flag Types*

*New Enum Constants*

*New Enums*

*New Structures*

*New Functions*

*Issues*

*Version History*

* Revision 1, 2019-08-16 (Cass Everitt)
** Initial extension description