
Merge pull request #42 from scossu/development

Alpha 12 development merge.
Stefano Cossu 7 years ago
parent
commit
d683573c07

+ 11 - 0
.travis.yml

@@ -6,3 +6,14 @@ install:
   - pip install -e .
 script:
   - python setup.py test
+
+deploy:
+    provider: pypi
+    user: "scossu"
+    password:
+        secure: "ANSqNv9T5AjDh2hkcWtikwxGu+MVmUC1K8s0QUZwGFfaLoNhwAe+Ol+a12It/oSQumZZQTPImpqvJ2dp6KaUXVvury9AI6La48lTinHNlZkRgLKhdqg0XV2ByxKkBxL0lmixtS+o0Ynv5CVX76iBxoaFTKU/eRMF9Pja6UvjNC7CZM+uh3C5/MUg82RdOS01R7m7SmM9uMTIoMzWb87837stTBmL8FiN3BkX25Weo4NDrLDamKl8QlFx2ozqkOj9SYJLO/HHhPv3HfSJeWNC6fsbNud9OAvKu+ZckPdVw1yNgjeTqpxhL7S/K0GuqZJ/efdwwPZLlsP+dSMSB3ftpUucpp3cBNOOjCvE+KHUWbHvIKJijwkMbVp/N/RWgfSzzwVlpy28JFzZirgvI0VGOovYI1NOW+kwe6aAffM0C00WA16bGZxxCDXeK2CeNDOpjXb0UhtwJTEayfpcRXEiginOaoUXISahPLnhVQoGLuyM+UG6oFg8RURAziXNOfaI6VgzcOF6EcfBhQlLs10RDVnfl9giP1kQ6twko/+n3bbRURDe1YXxk9HLwlzOszv8KGFU0G5UjRaX76RtMh5Y+a8wqni7g8ti74QiDmgG8a7aGZu9VUrLGnl1iRrM+xmoogYSuB7OxeUu+k+2mOJTHNz9qP+0+/FEeKazHoH8SmQ="
+    on:
+        tags: true
+        branch: master
+    distributions: "bdist_wheel"
+

+ 1 - 1
data/bootstrap/rsrc_centric_layout.sparql

@@ -11,7 +11,7 @@ INSERT DATA {
   GRAPH <info:fcsystem/graph/admin/> {
     <info:fcres/> a
       fcrepo:RepositoryRoot , fcrepo:Resource , fcrepo:Container ,
-      ldp:Container , ldp:BasicContainer , ldp:RDFSource ;
+      ldp:Resource , ldp:Container , ldp:BasicContainer , ldp:RDFSource ;
       fcrepo:created "$timestamp"^^xsd:dateTime ;
       fcrepo:lastModified "$timestamp"^^xsd:dateTime ;
     .

BIN
docs/assets/lsup_sparql_query_ui.png


+ 72 - 0
docs/discovery.rst

@@ -0,0 +1,72 @@
+Resource Discovery & Query
+==========================
+
+LAKEsuperior offers several ways to programmatically discover resources and
+data.
+
+LDP Traversal
+-------------
+
+The method compatible with the standard Fedora implementation and other LDP
+servers is to simply traverse the LDP tree. While this offers the broadest
+compatibility, it is quite expensive for the client, the server and the
+developer.
+
+For this method, please consult the dedicated `LDP specifications
+<https://www.w3.org/TR/ldp/>`__ and `Fedora API specs
+<https://wiki.duraspace.org/display/FEDORA4x/RESTful+HTTP+API+-+Containers>`__.
+
+SPARQL Query
+------------
+
+A `SPARQL <https://www.w3.org/TR/sparql11-query/>`__ endpoint is available in
+LAKEsuperior both as an API and a Web UI.
+
+.. figure:: assets/lsup_sparql_query_ui.png
+   :alt: LAKEsuperior SPARQL Query Window
+
+   LAKEsuperior SPARQL Query Window
+
+The UI is based on `YASGUI <http://about.yasgui.org/>`__.
+
+Note that:
+
+#. The SPARQL endpoint only supports the SPARQL 1.1 Query language.
+   SPARQL updates are not, and will not be, supported.
+#. The LAKEsuperior data model has an added layer of structure that is not
+   exposed through the LDP layer. The SPARQL endpoint exposes this low-level
+   structure, and it is beneficial to understand its layout. See :doc:`model`
+   for details in this regard.
+#. The underlying RDF structure is mostly expressed in named graphs. Querying
+   triples only, without a ``GRAPH`` clause, gives a quite uncluttered view of
+   the data, as close to the LDP representation as possible.
+
+SPARQL Caveats
+~~~~~~~~~~~~~~
+
+The SPARQL query facility has not yet been tested thoroughly. The RDFLib
+implementation that it is based upon can be quite efficient for certain
+queries, but has some downsides. For example, do **not** attempt the following
+query in a graph with more than a few thousand resources::
+
+    SELECT ?p ?o {
+      GRAPH ?g {
+        <info:fcres/my-uid> ?p ?o .
+      }
+    }
+
+The RDFLib implementation iterates over every single graph in the repository
+and performs the ``?s ?p ?o`` query on each of them. Since LAKEsuperior
+creates several graphs per resource, this can run for a very long time in any
+decently sized data set.
+
+The solution is to omit the graph clause, use a term search, or use a native
+Python method where applicable.
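For instance, the same lookup can be written without the ``GRAPH`` clause, so the store is queried once over all triples rather than once per named graph (``<info:fcres/my-uid>`` is the same placeholder URI as above):

```sparql
SELECT ?p ?o {
  <info:fcres/my-uid> ?p ?o .
}
```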
+
+Term Search
+-----------
+
+This feature has not yet been implemented. It is meant to provide a discovery
+tool based on simple term match, and possibly comparison. It should be more
+efficient and predictable than SPARQL.
+

+ 13 - 0
docs/fcrepo4_deltas.rst

@@ -76,6 +76,19 @@ identifiers will be different).
 This seems to break Hyrax at some point, but might have been fixed. This
 needs to be verified further.
 
+Allow PUT requests with empty body on existing resources
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+FCREPO4 returns a ``409 Conflict`` if a PUT request with no payload is sent
+to an existing resource.
+
+LAKEsuperior allows this operation, which results in deleting all the
+user-provided properties in that resource.
+
+If the original resource is an LDP-NR, however, the operation will raise a
+``415 Unsupported Media Type`` because the resource will be treated as an empty
+LDP-RS, which cannot replace an existing LDP-NR.
+
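As a sketch, the divergence can be observed with ``curl`` against a hypothetical local instance (host, port and path are illustrative, not part of this changeset):

```bash
# Empty-body PUT on an existing LDP-RS: FCREPO4 returns 409 Conflict,
# while LAKEsuperior accepts the request and deletes the user-provided
# properties of the resource.
curl -i -X PUT http://localhost:8000/ldp/my-resource

# If the target is an existing LDP-NR, LAKEsuperior returns
# 415 Unsupported Media Type instead.
curl -i -X PUT http://localhost:8000/ldp/my-binary
```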
 Non-standard client breaking changes
 ------------------------------------
 

+ 1 - 0
docs/index.rst

@@ -30,6 +30,7 @@ Indices and tables
    :maxdepth: 3
    :caption: User Reference
 
+    Discovery & Query <discovery>
     Divergences from Fedora 4 <fcrepo4_deltas>
     Messaging <messaging>
     Migration Guide <migration>

+ 106 - 1
docs/usage.rst

@@ -114,4 +114,109 @@ Immediately forget a resource
 Python API
 ----------
 
-**TODO**
+Set up the environment
+~~~~~~~~~~~~~~~~~~~~~~
+
+Before using the API, either do::
+
+    >>> import lakesuperior.env_setup
+
+Or, to specify an alternative configuration::
+
+    >>> from lakesuperior.config_parser import parse_config
+    >>> from lakesuperior.globals import AppGlobals
+    >>> env.config, test_config = parse_config('/my/custom/config_dir')
+    Reading configuration at /my/custom/config_dir
+    >>> env.app_globals = AppGlobals(env.config)
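The examples below also assume that the resource API and the namespace collection are in scope; both import paths appear elsewhere in this changeset:

```python
>>> from lakesuperior.api import resource as rsrc_api
>>> from lakesuperior.dictionaries.namespaces import ns_collection as nsc
```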
+
+Create and replace resources
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Create an LDP-RS (RDF resource), providing a Graph object::
+
+    >>> from rdflib import Graph, URIRef
+    >>> uid = '/rsrc_from_graph'
+    >>> gr = Graph().parse(data='<> a <http://ex.org/type#A> .',
+    ...     format='text/turtle', publicID=nsc['fcres'][uid])
+    >>> rsrc_api.create_or_replace(uid, graph=gr)
+
+Issuing a ``create_or_replace()`` on an existing UID will replace the existing
+property set with the provided one (PUT style).
+
+Create an LDP-NR (non-RDF source)::
+
+    >>> uid = '/test_ldpnr01'
+    >>> data = b'Hello. This is some dummy content.'
+    >>> rsrc_api.create_or_replace(
+    ...     uid, stream=BytesIO(data), mimetype='text/plain')
+    '_create_'
+
+Create under a known parent, providing a slug (POST style)::
+
+    >>> rsrc_api.create('/rsrc_from_stream', 'res1')
+
+
+Retrieve Resources
+~~~~~~~~~~~~~~~~~~
+
+Retrieve a resource::
+
+    >>> rsrc = rsrc_api.get('/rsrc_from_stream')
+    >>> rsrc.uid
+    '/rsrc_from_stream'
+    >>> rsrc.uri
+    rdflib.term.URIRef('info:fcres/rsrc_from_stream')
+    >>> set(rsrc.metadata)
+    {(rdflib.term.URIRef('info:fcres/rsrc_from_stream'),
+      rdflib.term.URIRef('http://fedora.info/definitions/v4/repository#created'),
+      rdflib.term.Literal('2018-04-06T03:30:49.460274+00:00', datatype=rdflib.term.URIRef('http://www.w3.org/2001/XMLSchema#dateTime'))),
+    [...]
+
+Retrieve non-RDF content::
+
+    >>> ldpnr = rsrc_api.get('/test_ldpnr01')
+    >>> ldpnr.content.read()
+    b'Hello. This is some dummy content.'
+
+See the :doc:`API docs <api>` for more details on resource methods.
+
+Update Resources
+~~~~~~~~~~~~~~~~
+
+Using a SPARQL update string::
+
+    >>> from rdflib import Literal, URIRef
+    >>> uid = '/test_delta_patch_wc'
+    >>> uri = nsc['fcres'][uid]
+    >>> init_trp = {
+    ...     (URIRef(uri), nsc['rdf'].type, nsc['foaf'].Person),
+    ...     (URIRef(uri), nsc['foaf'].name, Literal('Joe Bob')),
+    ...     (URIRef(uri), nsc['foaf'].name, Literal('Joe Average Bob')),
+    ... }
+
+    >>> update_str = '''
+    ... DELETE {}
+    ... INSERT { <> foaf:name "Joe Average 12oz Bob" . }
+    ... WHERE {}
+    ... '''
+    >>> rsrc_api.update(uid, update_str)
+
+Using add/remove triple sets::
+
+    >>> remove_trp = {
+    ...     (URIRef(uri), nsc['foaf'].name, None),
+    ... }
+    >>> add_trp = {
+    ...     (URIRef(uri), nsc['foaf'].name, Literal('Joan Knob')),
+    ... }
+
+    >>> gr = Graph()
+    >>> gr += init_trp
+    >>> rsrc_api.create_or_replace(uid, graph=gr)
+    >>> rsrc_api.update_delta(uid, remove_trp, add_trp)
+
+Note that wildcards can be used, but only in the remove triple set. Wherever
+``None`` is used, all matches will be removed (in this example, all values of
+``foaf:name``).
+
+Generally speaking, the delta approach of providing a set of remove triples
+and/or a set of add triples is more convenient than SPARQL, which is a better
+fit for complex query/update scenarios.
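To illustrate why: the remove/add sets are ordinary Python sets of 3-tuples, so a delta can be computed with plain set algebra. A standalone sketch follows; plain strings stand in for the rdflib terms used by the actual API, and the resource state shown is hypothetical:

```python
# Hypothetical current and desired states of a resource, as sets of
# (subject, predicate, object) 3-tuples.
FOAF_NAME = 'http://xmlns.com/foaf/0.1/name'
uri = 'info:fcres/test_delta_patch_wc'

current = {
    (uri, FOAF_NAME, 'Joe Bob'),
    (uri, FOAF_NAME, 'Joe Average Bob'),
}
desired = {
    (uri, FOAF_NAME, 'Joan Knob'),
}

# The delta is two set differences: triples to remove and triples to add.
remove_trp = current - desired
add_trp = desired - current

print(sorted(t[2] for t in remove_trp))  # ['Joe Average Bob', 'Joe Bob']
print(sorted(t[2] for t in add_trp))     # ['Joan Knob']
```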

+ 42 - 29
lakesuperior/api/resource.py

@@ -7,14 +7,14 @@ from threading import Lock, Thread
 
 
 import arrow
 
-from rdflib import Literal
+from rdflib import Graph, Literal, URIRef
 from rdflib.namespace import XSD
 
 from lakesuperior.config_parser import config
 from lakesuperior.exceptions import (
         InvalidResourceError, ResourceNotExistsError, TombstoneError)
 from lakesuperior.env import env
-from lakesuperior.globals import RES_DELETED
+from lakesuperior.globals import RES_DELETED, RES_UPDATED
 from lakesuperior.model.ldp_factory import LDP_NR_TYPE, LdpFactory
 from lakesuperior.store.ldp_rs.lmdb_store import TxnManager
 
@@ -77,7 +77,7 @@ def transaction(write=False):
             with TxnManager(env.app_globals.rdf_store, write=write) as txn:
                 ret = fn(*args, **kwargs)
             if len(env.app_globals.changelog):
-                job = Thread(target=process_queue)
+                job = Thread(target=_process_queue)
                 job.start()
             delattr(env, 'timestamp')
             delattr(env, 'timestamp_term')
@@ -86,18 +86,18 @@ def transaction(write=False):
     return _transaction_deco
 
 
-def process_queue():
+def _process_queue():
     """
     Process the message queue on a separate thread.
     """
     lock = Lock()
     lock.acquire()
     while len(env.app_globals.changelog):
-        send_event_msg(*env.app_globals.changelog.popleft())
+        _send_event_msg(*env.app_globals.changelog.popleft())
     lock.release()
 
 
-def send_event_msg(remove_trp, add_trp, metadata):
+def _send_event_msg(remove_trp, add_trp, metadata):
     """
     Send messages about a changed LDPR.
 
@@ -199,7 +199,8 @@ def create(parent, slug, **kwargs):
     :param str parent: UID of the parent resource.
     :param str slug: Tentative path relative to the parent UID.
     :param \*\*kwargs: Other parameters are passed to the
-      :meth:`LdpFactory.from_provided` method.
+      :py:meth:`~lakesuperior.model.ldp_factory.LdpFactory.from_provided`
+      method.
 
     :rtype: str
     :return: UID of the new resource.
@@ -214,31 +215,19 @@ def create(parent, slug, **kwargs):
 
 
 
 
 @transaction(True)
-def create_or_replace(uid, stream=None, **kwargs):
+def create_or_replace(uid, **kwargs):
     r"""
     Create or replace a resource with a specified UID.
 
-    If the resource already exists, all user-provided properties of the
-    existing resource are deleted. If the resource exists and the provided
-    content is empty, an exception is raised (not sure why, but that's how
-    FCREPO4 handles it).
-
     :param string uid: UID of the resource to be created or updated.
-    :param BytesIO stream: Content stream. If empty, an empty container is
-        created.
     :param \*\*kwargs: Other parameters are passed to the
-        :meth:`LdpFactory.from_provided` method.
+        :py:meth:`~lakesuperior.model.ldp_factory.LdpFactory.from_provided`
+        method.
 
     :rtype: str
     :return: Event type: whether the resource was created or updated.
     """
-    rsrc = LdpFactory.from_provided(uid, stream=stream, **kwargs)
-
-    if not stream and rsrc.is_stored:
-        raise InvalidResourceError(rsrc.uid,
-                'Resource {} already exists and no data set was provided.')
-
-    return rsrc.create_or_replace()
+    return LdpFactory.from_provided(uid, **kwargs).create_or_replace()
 
 
 @transaction(True)
@@ -248,19 +237,43 @@ def update(uid, update_str, is_metadata=False):
 
 
     :param string uid: Resource UID.
     :param string update_str: SPARQL-Update statements.
-    :param bool is_metadata: Whether the resource metadata is being updated.
-        If False, and the resource being updated is a LDP-NR, an error is
-        raised.
+    :param bool is_metadata: Whether the resource metadata are being updated.
+
+    :raise InvalidResourceError: If ``is_metadata`` is False and the resource
+        being updated is a LDP-NR.
     """
-    rsrc = LdpFactory.from_stored(uid)
+    # FCREPO is lenient here and Hyrax requires it.
+    rsrc = LdpFactory.from_stored(uid, handling='lenient')
     if LDP_NR_TYPE in rsrc.ldp_types and not is_metadata:
-        raise InvalidResourceError(uid)
+        raise InvalidResourceError(
+                'Cannot use this method to update an LDP-NR content.')
 
-    rsrc.sparql_update(update_str)
+    delta = rsrc.sparql_delta(update_str)
+    rsrc.modify(RES_UPDATED, *delta)
 
     return rsrc
 
 
+@transaction(True)
+def update_delta(uid, remove_trp, add_trp):
+    """
+    Update a resource graph (LDP-RS or LDP-NR) with sets of add/remove triples.
+
+    A set of triples to add and/or a set of triples to remove may be provided.
+
+    :param string uid: Resource UID.
+    :param set(tuple(rdflib.term.Identifier)) remove_trp: Triples to
+        remove, as 3-tuples of RDFLib terms.
+    :param set(tuple(rdflib.term.Identifier)) add_trp: Triples to
+        add, as 3-tuples of RDFLib terms.
+    """
+    rsrc = LdpFactory.from_stored(uid)
+    remove_trp = rsrc.check_mgd_terms(remove_trp)
+    add_trp = rsrc.check_mgd_terms(add_trp)
+
+    return rsrc.modify(RES_UPDATED, remove_trp, add_trp)
+
+
 @transaction(True)
 def create_version(uid, ver_uid):
     """
     """

+ 7 - 7
lakesuperior/endpoints/ldp.py

@@ -11,8 +11,7 @@ import arrow
 from flask import (
         Blueprint, g, make_response, render_template,
         request, send_file)
-from rdflib.namespace import XSD
-from rdflib.term import Literal
+from rdflib import Graph
 
 from lakesuperior.api import resource as rsrc_api
 from lakesuperior.dictionaries.namespaces import ns_collection as nsc
@@ -281,14 +280,15 @@ def put_resource(uid):
         # If the content is RDF, localize in-repo URIs.
         global_rdf = stream.read()
         local_rdf = g.tbox.localize_payload(global_rdf)
-        stream = BytesIO(local_rdf)
-        is_rdf = True
+        graph = Graph().parse(
+                data=local_rdf, format=mimetype, publicID=nsc['fcres'][uid])
+        stream = mimetype = None
     else:
-        is_rdf = False
+        graph = None
 
     try:
         evt = rsrc_api.create_or_replace(uid, stream=stream, mimetype=mimetype,
-                handling=handling, disposition=disposition)
+                graph=graph, handling=handling, disposition=disposition)
     except (InvalidResourceError, ResourceExistsError) as e:
         return str(e), 409
     except (ServerManagedTermError, SingleSubjectError) as e:
@@ -302,7 +302,7 @@ def put_resource(uid):
     if evt == RES_CREATED:
         rsp_code = 201
         rsp_headers['Location'] = rsp_body = uri
-        if mimetype and not is_rdf:
+        if mimetype and not graph:
             rsp_headers['Link'] = (
                     '<{0}/fcr:metadata>; rel="describedby"'.format(uri))
     else:

+ 39 - 39
lakesuperior/model/ldp_factory.py

@@ -36,7 +36,7 @@ class LdpFactory:
             raise InvalidResourceError(uid)
         if rdfly.ask_rsrc_exists(uid):
             raise ResourceExistsError(uid)
-        rsrc = Ldpc(uid, provided_imr=Resource(Graph(), nsc['fcres'][uid]))
+        rsrc = Ldpc(uid, provided_imr=Graph(identifier=nsc['fcres'][uid]))
 
         return rsrc
 
@@ -59,8 +59,8 @@ class LdpFactory:
 
 
         rsrc_meta = rdfly.get_metadata(uid)
         #logger.debug('Extracted metadata: {}'.format(
-        #        pformat(set(rsrc_meta.graph))))
-        rdf_types = set(rsrc_meta.graph[imr_urn : RDF.type])
+        #        pformat(set(rsrc_meta))))
+        rdf_types = set(rsrc_meta[imr_urn : RDF.type])
 
         if LDP_NR_TYPE in rdf_types:
             logger.info('Resource is a LDP-NR.')
@@ -78,38 +78,43 @@ class LdpFactory:
 
 
 
 
     @staticmethod
-    def from_provided(
-            uid, mimetype=None, stream=None, provided_imr=None, **kwargs):
+    def from_provided(uid, mimetype=None, stream=None, graph=None, **kwargs):
         r"""
-        Determine LDP type from request content.
+        Create an LDPR instance from provided data.
+
+        The LDP class (LDP-RS, LDP-NR, etc.) is determined by the contents
+        passed.
 
         :param str uid: UID of the resource to be created or updated.
-        :param str mimetype: The provided content MIME type.
-        :param stream: The provided data stream. This can be
-            RDF or non-RDF content, or None. In the latter case, an empty
-            container is created.
-        :type stream: IOStream or None
+        :param str mimetype: The provided content MIME type. If this is
+            specified the resource is considered a LDP-NR and a ``stream``
+            *must* be provided.
+        :param IOStream stream: The provided data stream.
+        :param rdflib.Graph graph: Initial graph to populate the
+            resource with. This can be used for LDP-RS and LDP-NR types alike.
         :param \*\*kwargs: Arguments passed to the LDP class constructor.
+
+        :raise ValueError: if ``mimetype`` is specified but no data stream is
+            provided.
         """
         uri = nsc['fcres'][uid]
 
-        if not stream and not mimetype:
-            # Create empty LDPC.
-            logger.info('No data received in request. '
-                    'Creating empty container.')
-            inst = Ldpc(uid, provided_imr=Resource(Graph(), uri), **kwargs)
-        elif __class__.is_rdf_parsable(mimetype):
-            # Create container and populate it with provided RDF data.
-            input_rdf = stream.read()
-            gr = Graph().parse(data=input_rdf, format=mimetype, publicID=uri)
-            #logger.debug('Provided graph: {}'.format(
-            #        pformat(set(provided_gr))))
-            provided_imr = Resource(gr, uri)
+        provided_imr = Graph(identifier=uri)
+        if graph:
+            provided_imr += graph
+        #logger.debug('Provided graph: {}'.format(
+        #        pformat(set(provided_imr))))
+
+        if stream is None:
+            # Resource is a LDP-RS.
+            if mimetype:
+                raise ValueError(
+                    'Binary stream must be provided if mimetype is specified.')
 
             # Determine whether it is a basic, direct or indirect container.
-            if Ldpr.MBR_RSRC_URI in gr.predicates() and \
-                    Ldpr.MBR_REL_URI in gr.predicates():
-                if Ldpr.INS_CNT_REL_URI in gr.predicates():
+            if Ldpr.MBR_RSRC_URI in provided_imr.predicates() and \
+                    Ldpr.MBR_REL_URI in provided_imr.predicates():
+                if Ldpr.INS_CNT_REL_URI in provided_imr.predicates():
                     cls = LdpIc
                 else:
                     cls = LdpDc
@@ -118,33 +123,28 @@ class LdpFactory:
 
 
             inst = cls(uid, provided_imr=provided_imr, **kwargs)
 
-            # Make sure we are not updating an LDP-RS with an LDP-NR.
+            # Make sure we are not updating an LDP-NR with an LDP-RS.
             if inst.is_stored and LDP_NR_TYPE in inst.ldp_types:
                 raise IncompatibleLdpTypeError(uid, mimetype)
 
             if kwargs.get('handling', 'strict') != 'none':
-                inst._check_mgd_terms(inst.provided_imr.graph)
+                inst.check_mgd_terms(inst.provided_imr)
 
         else:
-            # Create a LDP-NR and equip it with the binary file provided.
-            # The IMR can also be provided for additional metadata.
-            if not provided_imr:
-                provided_imr = Resource(Graph(), uri)
+            # Resource is a LDP-NR.
+            if not mimetype:
+                mimetype = 'application/octet-stream'
+
             inst = LdpNr(uid, stream=stream, mimetype=mimetype,
                     provided_imr=provided_imr, **kwargs)
 
-            # Make sure we are not updating an LDP-NR with an LDP-RS.
+            # Make sure we are not updating an LDP-RS with an LDP-NR.
             if inst.is_stored and LDP_RS_TYPE in inst.ldp_types:
                 raise IncompatibleLdpTypeError(uid, mimetype)
 
-        logger.info('Creating resource of type: {}'.format(
+        logger.debug('Creating resource of type: {}'.format(
                 inst.__class__.__name__))
 
-        try:
-            types = inst.types
-        except (TombstoneError, ResourceNotExistsError):
-            types = set()
-
         return inst
 
 

+ 34 - 10
lakesuperior/model/ldp_nr.py

@@ -45,7 +45,7 @@ class LdpNr(Ldpr):
 
 
         if not mimetype:
             self.mimetype = (
-                    self.metadata.value(nsc['ebucore'].hasMimeType)
+                    self.metadata.value(self.uri, nsc['ebucore'].hasMimeType)
                     if self.is_stored
                     else 'application/octet-stream')
         else:
@@ -56,13 +56,34 @@ class LdpNr(Ldpr):
 
 
     @property
     def filename(self):
-        return self.imr.value(nsc['ebucore'].filename)
+        """
+        File name of the original uploaded file.
+
+        :rtype: str
+        """
+        return self.imr.value(self.uri, nsc['ebucore'].filename)
+
+
+    @property
+    def content(self):
+        """
+        Binary content.
+
+        :return: File handle of the resource content.
+        :rtype: io.BufferedReader
+        """
+        return open(self.local_path, 'rb')
 
 
     @property
     def local_path(self):
-        cksum_term = self.imr.value(nsc['premis'].hasMessageDigest)
-        cksum = str(cksum_term.identifier.replace('urn:sha1:',''))
+        """
+        Path on disk of the binary content.
+
+        :rtype: str
+        """
+        cksum_term = self.imr.value(self.uri, nsc['premis'].hasMessageDigest)
+        cksum = str(cksum_term.replace('urn:sha1:',''))
         return nonrdfly.__class__.local_path(
                 nonrdfly.root, cksum, nonrdfly.bl, nonrdfly.bc)
 
@@ -104,20 +125,23 @@ class LdpNr(Ldpr):
 
 
         # File size.
         logger.debug('Data stream size: {}'.format(self.size))
-        self.provided_imr.set(nsc['premis'].hasSize, Literal(self.size))
+        self.provided_imr.set((
+            self.uri, nsc['premis'].hasSize, Literal(self.size)))
 
         # Checksum.
         cksum_term = URIRef('urn:sha1:{}'.format(self.digest))
-        self.provided_imr.set(nsc['premis'].hasMessageDigest, cksum_term)
+        self.provided_imr.set((
+            self.uri, nsc['premis'].hasMessageDigest, cksum_term))
 
         # MIME type.
-        self.provided_imr.set(nsc['ebucore']['hasMimeType'], 
-                Literal(self.mimetype))
+        self.provided_imr.set((
+            self.uri, nsc['ebucore']['hasMimeType'], Literal(self.mimetype)))
 
         # File name.
         logger.debug('Disposition: {}'.format(self.disposition))
         try:
-            self.provided_imr.set(nsc['ebucore']['filename'], Literal(
-                    self.disposition['attachment']['parameters']['filename']))
+            self.provided_imr.set((
+                self.uri, nsc['ebucore']['filename'], Literal(
+                self.disposition['attachment']['parameters']['filename'])))
         except (KeyError, TypeError) as e:
             pass

+ 188 - 171
lakesuperior/model/ldpr.py

@@ -2,12 +2,12 @@ import logging
 
 
 from abc import ABCMeta
 from collections import defaultdict
+from urllib.parse import urldefrag
 from uuid import uuid4
 
 import arrow
 
 from rdflib import Graph, URIRef, Literal
-from rdflib.resource import Resource
 from rdflib.namespace import RDF
 
 from lakesuperior.env import env
@@ -28,15 +28,13 @@ logger = logging.getLogger(__name__)
 
 
 
 
 class Ldpr(metaclass=ABCMeta):
-    """LDPR (LDP Resource).
-
-    Definition: https://www.w3.org/TR/ldp/#ldpr-resource
+    """
+    LDPR (LDP Resource).
 
     This class and related subclasses contain the implementation pieces of
-    the vanilla LDP specifications. This is extended by the
-    `lakesuperior.fcrepo.Resource` class.
-
-    See inheritance graph: https://www.w3.org/TR/ldp/#fig-ldpc-types
+    the `LDP Resource <https://www.w3.org/TR/ldp/#ldpr-resource>`__
+    specifications, according to their `inheritance graph
+    <https://www.w3.org/TR/ldp/#fig-ldpc-types>`__.
 
     **Note**: Even though LdpNr (which is a subclass of Ldpr) handles binary
     files, it still has an RDF representation in the triplestore. Hence, some
@@ -145,12 +143,26 @@ class Ldpr(metaclass=ABCMeta):
     @property
     @property
     def imr(self):
     def imr(self):
         """
         """
-        Extract an in-memory resource from the graph store.
+        In-Memory Resource.
 
 
-        If the resource is not stored (yet), a `ResourceNotExistsError` is
-        raised.
+        This is a copy of the resource extracted from the graph store. It is a
+        graph resource whose identifier is the URI of the resource.
 
 
-        :rtype: rdflib.Resource
+        >>> rsrc = rsrc_api.get('/')
+        >>> rsrc.imr.identifier
+        rdflib.term.URIRef('info:fcres/')
+        >>> rsrc.imr.value(rsrc.imr.identifier, nsc['fcrepo'].lastModified)
+        rdflib.term.Literal(
+            '2018-04-03T05:20:33.774746+00:00',
+            datatype=rdflib.term.URIRef(
+                'http://www.w3.org/2001/XMLSchema#dateTime'))
+
+        The IMR can be read and manipulated, as well as used to
+        update the stored resource.
+
+        :rtype: rdflib.Graph
+        :raise lakesuperior.exceptions.ResourceNotExistsError: If the resource
+            is not stored (yet).
         """
         if not hasattr(self, '_imr'):
             if hasattr(self, '_imr_options'):
@@ -162,7 +174,7 @@ class Ldpr(metaclass=ABCMeta):
             else:
                 imr_options = {}
             options = dict(imr_options, strict=True)
-            self._imr = rdfly.extract_imr(self.uid, **options)
+            self._imr = rdfly.get_imr(self.uid, **options)

         return self._imr

@@ -175,11 +187,8 @@ class Ldpr(metaclass=ABCMeta):
         :param v: New set of triples to populate the IMR with.
         :type v: set or rdflib.Graph
         """
-        if isinstance(v, Resource):
-            v = v.graph
-        self._imr = Resource(Graph(), self.uri)
-        gr = self._imr.graph
-        gr += v
+        self._imr = Graph(identifier=self.uri)
+        self._imr += v


     @imr.deleter
@@ -224,7 +233,7 @@ class Ldpr(metaclass=ABCMeta):
         """
         out_gr = Graph(identifier=self.uri)

-        for t in self.imr.graph:
+        for t in self.imr:
             if (
                 # Exclude digest hash and version information.
                 t[1] not in {
@@ -248,8 +257,7 @@ class Ldpr(metaclass=ABCMeta):
         """
         if not hasattr(self, '_version_info'):
             try:
-                #@ TODO get_version_info should return a graph.
-                self._version_info = rdfly.get_version_info(self.uid).graph
+                self._version_info = rdfly.get_version_info(self.uid)
             except ResourceNotExistsError as e:
                 self._version_info = Graph(identifier=self.uri)

@@ -272,7 +280,7 @@ class Ldpr(metaclass=ABCMeta):
     def is_stored(self):
         if not hasattr(self, '_is_stored'):
             if hasattr(self, '_imr'):
-                self._is_stored = len(self.imr.graph) > 0
+                self._is_stored = len(self.imr) > 0
             else:
                 self._is_stored = rdfly.ask_rsrc_exists(self.uid)

@@ -286,15 +294,15 @@ class Ldpr(metaclass=ABCMeta):
         :rtype: set(rdflib.term.URIRef)
         """
         if not hasattr(self, '_types'):
-            if len(self.metadata.graph):
+            if len(self.metadata):
                 metadata = self.metadata
             elif getattr(self, 'provided_imr', None) and \
-                    len(self.provided_imr.graph):
+                    len(self.provided_imr):
                 metadata = self.provided_imr
             else:
                 return set()

-            self._types = set(metadata.graph[self.uri: RDF.type])
+            self._types = set(metadata[self.uri: RDF.type])

         return self._types

@@ -319,12 +327,13 @@ class Ldpr(metaclass=ABCMeta):
         """
         out_headers = defaultdict(list)

-        digest = self.metadata.value(nsc['premis'].hasMessageDigest)
+        digest = self.metadata.value(self.uri, nsc['premis'].hasMessageDigest)
         if digest:
             etag = digest.identifier.split(':')[-1]
             out_headers['ETag'] = 'W/"{}"'.format(etag),

-        last_updated_term = self.metadata.value(nsc['fcrepo'].lastModified)
+        last_updated_term = self.metadata.value(
+            self.uri, nsc['fcrepo'].lastModified)
         if last_updated_term:
             out_headers['Last-Modified'] = arrow.get(last_updated_term)\
                 .format('ddd, D MMM YYYY HH:mm:ss Z')
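The `Last-Modified` header above is rendered with arrow's `ddd, D MMM YYYY HH:mm:ss Z` pattern. A rough stdlib-only sketch of the same RFC 7231 (IMF-fixdate) formatting, with an invented timestamp standing in for the `fcrepo:lastModified` literal:

```python
from datetime import datetime, timezone
from email.utils import format_datetime

# Invented timestamp standing in for an fcrepo:lastModified literal.
last_modified = datetime(2018, 4, 3, 5, 20, 33, tzinfo=timezone.utc)

# RFC 7231 IMF-fixdate, the format HTTP expects for Last-Modified.
header = format_datetime(last_modified, usegmt=True)
print(header)  # Tue, 03 Apr 2018 05:20:33 GMT
```

`format_datetime(..., usegmt=True)` requires an aware UTC datetime and emits the `GMT` suffix mandated by HTTP.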
@@ -340,7 +349,7 @@ class Ldpr(metaclass=ABCMeta):
         """
         Get a version by label.
         """
-        return rdfly.extract_imr(self.uid, ver_uid, **kwargs).graph
+        return rdfly.get_imr(self.uid, ver_uid, **kwargs)


     def create_or_replace(self, create_only=False):
@@ -365,15 +374,15 @@ class Ldpr(metaclass=ABCMeta):
         remove_trp = {
             (self.uri, pred, None) for pred in self.delete_preds_on_replace}
         add_trp = (
-            set(self.provided_imr.graph) |
+            set(self.provided_imr) |
             self._containment_rel(create))

-        self._modify_rsrc(ev_type, remove_trp, add_trp)
-        new_gr = Graph()
+        self.modify(ev_type, remove_trp, add_trp)
+        new_gr = Graph(identifier=self.uri)
         for trp in add_trp:
             new_gr.add(trp)

-        self.imr = new_gr.resource(self.uri)
+        self.imr = new_gr

         return ev_type

@@ -393,7 +402,7 @@ class Ldpr(metaclass=ABCMeta):
         self.create_rsrc_snapshot(uuid4())

         remove_trp = {
-            trp for trp in self.imr.graph
+            trp for trp in self.imr
             if trp[1] != nsc['fcrepo'].hasVersion}

         if tstone_pointer:
@@ -405,15 +414,15 @@ class Ldpr(metaclass=ABCMeta):
                 (self.uri, nsc['fcrepo'].created, env.timestamp_term),
             }

-        self._modify_rsrc(RES_DELETED, remove_trp, add_trp)
+        self.modify(RES_DELETED, remove_trp, add_trp)

         if inbound:
-            for ib_rsrc_uri in self.imr.graph.subjects(None, self.uri):
+            for ib_rsrc_uri in self.imr.subjects(None, self.uri):
                 remove_trp = {(ib_rsrc_uri, None, self.uri)}
                 ib_rsrc = Ldpr(ib_rsrc_uri)
                 # To preserve inbound links in history, create a snapshot
                 ib_rsrc.create_rsrc_snapshot(uuid4())
-                ib_rsrc._modify_rsrc(RES_UPDATED, remove_trp)
+                ib_rsrc.modify(RES_UPDATED, remove_trp)

         return RES_DELETED

@@ -444,7 +453,7 @@ class Ldpr(metaclass=ABCMeta):
         ver_uid = '{}/{}'.format(vers_uid, ver_uid)
         ver_uri = nsc['fcres'][ver_uid]
         ver_add_gr.add((ver_uri, RDF.type, nsc['fcrepo'].Version))
-        for t in self.imr.graph:
+        for t in self.imr:
             if (
                 t[1] == RDF.type and t[2] in {
                     nsc['fcrepo'].Binary,
@@ -472,7 +481,7 @@ class Ldpr(metaclass=ABCMeta):
             (self.uri, nsc['fcrepo'].hasVersion, ver_uri),
             (self.uri, nsc['fcrepo'].hasVersions, nsc['fcres'][vers_uid]),
         }
-        self._modify_rsrc(RES_UPDATED, add_trp=rsrc_add_gr)
+        self.modify(RES_UPDATED, add_trp=rsrc_add_gr)

         return ver_uid

@@ -483,9 +492,9 @@ class Ldpr(metaclass=ABCMeta):

         @EXPERIMENTAL
         """
-        tstone_trp = set(rdfly.extract_imr(self.uid, strict=False).graph)
+        tstone_trp = set(rdfly.get_imr(self.uid, strict=False))
-        ver_rsp = self.version_info.graph.query('''
+        ver_rsp = self.version_info.query('''
         SELECT ?uid {
           ?latest fcrepo:hasVersionLabel ?uid ;
             fcrepo:created ?ts .
@@ -494,7 +503,7 @@ class Ldpr(metaclass=ABCMeta):
         LIMIT 1
         ''')
         ver_uid = str(ver_rsp.bindings[0]['uid'])
-        ver_trp = set(rdfly.get_metadata(self.uid, ver_uid).graph)
+        ver_trp = set(rdfly.get_metadata(self.uid, ver_uid))

         laz_gr = Graph()
         for t in ver_trp:
@@ -509,7 +518,7 @@ class Ldpr(metaclass=ABCMeta):
             laz_gr.add((self.uri, RDF.type, nsc['fcrepo'].Container))

         laz_set = set(laz_gr) | self._containment_rel()
-        self._modify_rsrc(RES_CREATED, tstone_trp, laz_set)
+        self.modify(RES_CREATED, tstone_trp, laz_set)

         return self.uri

@@ -544,19 +553,112 @@ class Ldpr(metaclass=ABCMeta):
         if backup:
             self.create_version()

-        ver_gr = rdfly.extract_imr(
+        ver_gr = rdfly.get_imr(
             self.uid, ver_uid=ver_uid, incl_children=False)
-        self.provided_imr = Resource(Graph(), self.uri)
+        self.provided_imr = Graph(identifier=self.uri)
-        for t in ver_gr.graph:
+        for t in ver_gr:
             if not self._is_trp_managed(t):
-                self.provided_imr.add(t[1], t[2])
+                self.provided_imr.add((self.uri, t[1], t[2]))
             # @TODO Check individual objects: if they are repo-managed URIs
             # and not existing or tombstones, they are not added.

         return self.create_or_replace(create_only=False)


+    def check_mgd_terms(self, trp):
+        """
+        Check whether server-managed terms are in an RDF payload.
+
+        :param set trp: The set of triples to validate.
+        """
+        subjects = {t[0] for t in trp}
+        offending_subjects = subjects & srv_mgd_subjects
+        if offending_subjects:
+            if self.handling == 'strict':
+                raise ServerManagedTermError(offending_subjects, 's')
+            else:
+                for s in offending_subjects:
+                    logger.info('Removing offending subj: {}'.format(s))
+                    # Iterate over a copy: removing items from a set while
+                    # iterating it raises RuntimeError.
+                    for t in set(trp):
+                        if t[0] == s:
+                            trp.remove(t)
+
+        predicates = {t[1] for t in trp}
+        offending_predicates = predicates & srv_mgd_predicates
+        # Allow some predicates if the resource is being created.
+        if offending_predicates:
+            if self.handling == 'strict':
+                raise ServerManagedTermError(offending_predicates, 'p')
+            else:
+                for p in offending_predicates:
+                    logger.info('Removing offending pred: {}'.format(p))
+                    for t in set(trp):
+                        if t[1] == p:
+                            trp.remove(t)
+
+        types = {t[2] for t in trp if t[1] == RDF.type}
+        offending_types = types & srv_mgd_types
+        if not self.is_stored:
+            offending_types -= self.smt_allow_on_create
+        if offending_types:
+            if self.handling == 'strict':
+                raise ServerManagedTermError(offending_types, 't')
+            else:
+                for to in offending_types:
+                    logger.info('Removing offending type: {}'.format(to))
+                    for t in set(trp):
+                        if t[1] == RDF.type and t[2] == to:
+                            trp.remove(t)
+
+        #logger.debug('Sanitized graph: {}'.format(trp.serialize(
+        #    format='turtle').decode('utf-8')))
+        return trp
+
+
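`check_mgd_terms` boils down to set arithmetic over triples. A minimal sketch of the lenient-mode filtering, using plain tuples and invented URI strings in place of rdflib terms:

```python
# Server-managed predicates; the CURIE strings here are invented placeholders.
srv_mgd_predicates = {'fcrepo:created', 'fcrepo:lastModified'}

trp = {
    ('res:a', 'fcrepo:created', '2018-01-01'),
    ('res:a', 'dc:title', 'Hello'),
}

offending = {t[1] for t in trp} & srv_mgd_predicates
# Iterate over a copy: removing from a set while iterating it directly
# raises RuntimeError.
for t in set(trp):
    if t[1] in offending:
        trp.remove(t)

print(trp)  # {('res:a', 'dc:title', 'Hello')}
```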
+    def sparql_delta(self, q):
+        """
+        Calculate the delta obtained by a SPARQL Update operation.
+
+        This is a critical component of the SPARQL update process and does a
+        couple of things:
+
+        1. It ensures that no resources outside of the subject of the request
+        are modified (e.g. by variable subjects).
+        2. It verifies that none of the terms being modified is server managed.
+
+        This method extracts an in-memory copy of the resource and performs the
+        query on that once it has checked if any of the server managed terms is
+        in the delta. If it is, it raises an exception.
+
+        NOTE: This only checks if a server-managed term is effectively being
+        modified. If a server-managed term is present in the query but does not
+        cause any change in the updated resource, no error is raised.
+
+        :rtype: tuple(set)
+        :return: Remove and add triple sets. These can be used
+        with ``BaseStoreLayout.update_resource`` and/or recorded as separate
+        events in a provenance tracking system.
+        """
+        logger.debug('Provided SPARQL query: {}'.format(q))
+        pre_gr = self.imr
+
+        post_gr = pre_gr | Graph()
+        post_gr.update(q)
+
+        remove_gr, add_gr = self._dedup_deltas(pre_gr, post_gr)
+
+        #logger.debug('Removing: {}'.format(
+        #    remove_gr.serialize(format='turtle').decode('utf8')))
+        #logger.debug('Adding: {}'.format(
+        #    add_gr.serialize(format='turtle').decode('utf8')))
+
+        remove_trp = self.check_mgd_terms(set(remove_gr))
+        add_trp = self.check_mgd_terms(set(add_gr))
+
+        return remove_trp, add_trp
+
+
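The remove/add sets returned by `sparql_delta` come out of `_dedup_deltas`, which is essentially symmetric set difference between the pre-update and post-update graphs. A sketch with plain tuples standing in for rdflib triples:

```python
# Pre- and post-update triple sets (invented example data).
pre = {('s', 'p', 'old'), ('s', 'p2', 'kept')}
post = {('s', 'p', 'new'), ('s', 'p2', 'kept')}

remove_trp = pre - post  # triples deleted by the update
add_trp = post - pre     # triples added by the update

print(remove_trp)  # {('s', 'p', 'old')}
print(add_trp)     # {('s', 'p', 'new')}
```

Unchanged triples fall out of both differences, which is why a no-op update touching a server-managed term raises no error.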
     ## PROTECTED METHODS ##

     def _is_trp_managed(self, t):
@@ -571,7 +673,7 @@ class Ldpr(metaclass=ABCMeta):
             t[1] == RDF.type and t[2] in srv_mgd_types)


-    def _modify_rsrc(
+    def modify(
             self, ev_type, remove_trp=set(), add_trp=set()):
         """
         Low-level method to modify a graph for a single resource.
@@ -606,7 +708,7 @@ class Ldpr(metaclass=ABCMeta):
         """
         try:
             rsrc_type = tuple(str(t) for t in self.types)
-            actor = self.metadata.value(nsc['fcrepo'].createdBy)
+            actor = self.metadata.value(self.uri, nsc['fcrepo'].createdBy)
         except (ResourceNotExistsError, TombstoneError):
             rsrc_type = ()
             actor = None
@@ -625,10 +727,18 @@ class Ldpr(metaclass=ABCMeta):


     def _check_ref_int(self, config):
-        gr = self.provided_imr.graph
+        """
+        Check referential integrity of a resource.
-        for o in gr.objects():
-            if isinstance(o, URIRef) and str(o).startswith(nsc['fcres']):
+        :param str config: If set to ``strict``, a
+           :class:`lakesuperior.exceptions.RefIntViolationError` is raised.
+           Otherwise, the violation is simply logged.
+        """
+        for o in self.provided_imr.objects():
+            if (
+                    isinstance(o, URIRef) and
+                    str(o).startswith(nsc['fcres']) and
+                    urldefrag(o).url.rstrip('/') != str(self.uri)):
                 obj_uid = rdfly.uri_to_uid(o)
                 if not rdfly.ask_rsrc_exists(obj_uid):
                     if config == 'strict':
@@ -637,48 +747,7 @@ class Ldpr(metaclass=ABCMeta):
                         logger.info(
                             'Removing link to non-existent repo resource: {}'
                             .format(obj_uid))
-                        gr.remove((None, None, o))
-
-
-    def _check_mgd_terms(self, gr):
-        """
-        Check whether server-managed terms are in a RDF payload.
-
-        :param rdflib.Graph gr: The graph to validate.
-        """
-        offending_subjects = set(gr.subjects()) & srv_mgd_subjects
-        if offending_subjects:
-            if self.handling == 'strict':
-                raise ServerManagedTermError(offending_subjects, 's')
-            else:
-                for s in offending_subjects:
-                    logger.info('Removing offending subj: {}'.format(s))
-                    gr.remove((s, None, None))
-
-        offending_predicates = set(gr.predicates()) & srv_mgd_predicates
-        # Allow some predicates if the resource is being created.
-        if offending_predicates:
-            if self.handling == 'strict':
-                raise ServerManagedTermError(offending_predicates, 'p')
-            else:
-                for p in offending_predicates:
-                    logger.info('Removing offending pred: {}'.format(p))
-                    gr.remove((None, p, None))
-
-        offending_types = set(gr.objects(predicate=RDF.type)) & srv_mgd_types
-        if not self.is_stored:
-            offending_types -= self.smt_allow_on_create
-        if offending_types:
-            if self.handling == 'strict':
-                raise ServerManagedTermError(offending_types, 't')
-            else:
-                for t in offending_types:
-                    logger.info('Removing offending type: {}'.format(t))
-                    gr.remove((None, RDF.type, t))
-
-        #logger.debug('Sanitized graph: {}'.format(gr.serialize(
-        #    format='turtle').decode('utf-8')))
-        return gr
+                        self.provided_imr.remove((None, None, o))


     def _add_srv_mgd_triples(self, create=False):
@@ -689,28 +758,32 @@ class Ldpr(metaclass=ABCMeta):
         """
         # Base LDP types.
         for t in self.base_types:
-            self.provided_imr.add(RDF.type, t)
+            self.provided_imr.add((self.uri, RDF.type, t))

         # Message digest.
-        cksum = self.tbox.rdf_cksum(self.provided_imr.graph)
-        self.provided_imr.set(
-            nsc['premis'].hasMessageDigest,
-            URIRef('urn:sha1:{}'.format(cksum)))
+        cksum = self.tbox.rdf_cksum(self.provided_imr)
+        self.provided_imr.set((
+            self.uri, nsc['premis'].hasMessageDigest,
+            URIRef('urn:sha1:{}'.format(cksum))))

         # Create and modify timestamp.
         if create:
-            self.provided_imr.set(nsc['fcrepo'].created, env.timestamp_term)
-            self.provided_imr.set(nsc['fcrepo'].createdBy, self.DEFAULT_USER)
+            self.provided_imr.set((
+                self.uri, nsc['fcrepo'].created, env.timestamp_term))
+            self.provided_imr.set((
+                self.uri, nsc['fcrepo'].createdBy, self.DEFAULT_USER))
         else:
-            self.provided_imr.set(
-                nsc['fcrepo'].created, self.metadata.value(
-                    nsc['fcrepo'].created))
-            self.provided_imr.set(
-                nsc['fcrepo'].createdBy, self.metadata.value(
-                    nsc['fcrepo'].createdBy))
+            self.provided_imr.set((
+                self.uri, nsc['fcrepo'].created, self.metadata.value(
+                    self.uri, nsc['fcrepo'].created)))
+            self.provided_imr.set((
+                self.uri, nsc['fcrepo'].createdBy, self.metadata.value(
+                    self.uri, nsc['fcrepo'].createdBy)))
-        self.provided_imr.set(nsc['fcrepo'].lastModified, env.timestamp_term)
-        self.provided_imr.set(nsc['fcrepo'].lastModifiedBy, self.DEFAULT_USER)
+        self.provided_imr.set((
+            self.uri, nsc['fcrepo'].lastModified, env.timestamp_term))
+        self.provided_imr.set((
+            self.uri, nsc['fcrepo'].lastModifiedBy, self.DEFAULT_USER))


     def _containment_rel(self, create):
@@ -763,7 +836,7 @@ class Ldpr(metaclass=ABCMeta):
             add_gr = Graph()
             add_gr.add(
                 (nsc['fcres'][parent_uid], nsc['ldp'].contains, self.uri))
-            parent_rsrc._modify_rsrc(RES_UPDATED, add_trp=add_gr)
+            parent_rsrc.modify(RES_UPDATED, add_trp=add_gr)

         # Direct or indirect container relationship.
         return self._add_ldp_dc_ic_rel(parent_rsrc)
@@ -790,7 +863,7 @@ class Ldpr(metaclass=ABCMeta):

         :param rdflib.resource.Resource cont_rsrc: The container resource.
         """
-        cont_p = set(cont_rsrc.metadata.graph.predicates())
+        cont_p = set(cont_rsrc.metadata.predicates())

         logger.info('Checking direct or indirect containment.')
         logger.debug('Parent predicates: {}'.format(cont_p))
@@ -800,8 +873,8 @@ class Ldpr(metaclass=ABCMeta):
         if self.MBR_RSRC_URI in cont_p and self.MBR_REL_URI in cont_p:
             from lakesuperior.model.ldp_factory import LdpFactory

-            s = cont_rsrc.metadata.value(self.MBR_RSRC_URI).identifier
-            p = cont_rsrc.metadata.value(self.MBR_REL_URI).identifier
+            s = cont_rsrc.metadata.value(cont_rsrc.uri, self.MBR_RSRC_URI)
+            p = cont_rsrc.metadata.value(cont_rsrc.uri, self.MBR_REL_URI)
             if cont_rsrc.metadata[RDF.type: nsc['ldp'].DirectContainer]:
                 logger.info('Parent is a direct container.')
@@ -815,68 +888,12 @@ class Ldpr(metaclass=ABCMeta):
                     self.INS_CNT_REL_URI in cont_p):
                 logger.info('Parent is an indirect container.')
                 cont_rel_uri = cont_rsrc.metadata.value(
-                    self.INS_CNT_REL_URI).identifier
-                o = self.provided_imr.value(cont_rel_uri).identifier
+                    cont_rsrc.uri, self.INS_CNT_REL_URI)
+                o = self.provided_imr.value(self.uri, cont_rel_uri)
                 logger.debug('Target URI: {}'.format(o))
                 logger.debug('Creating IC triples.')

             target_rsrc = LdpFactory.from_stored(rdfly.uri_to_uid(s))
-            target_rsrc._modify_rsrc(RES_UPDATED, add_trp={(s, p, o)})
+            target_rsrc.modify(RES_UPDATED, add_trp={(s, p, o)})

         return add_trp
-
-
-    def sparql_update(self, update_str):
-        """
-        Apply a SPARQL update to a resource.
-
-        :param str update_str: SPARQL-Update string. All URIs are local.
-        """
-        # FCREPO does that and Hyrax requires it.
-        self.handling = 'lenient'
-        delta = self._sparql_delta(update_str)
-
-        self._modify_rsrc(RES_UPDATED, *delta)
-
-
-    def _sparql_delta(self, q):
-        """
-        Calculate the delta obtained by a SPARQL Update operation.
-
-        This is a critical component of the SPARQL update prcess and does a
-        couple of things:
-
-        1. It ensures that no resources outside of the subject of the request
-        are modified (e.g. by variable subjects)
-        2. It verifies that none of the terms being modified is server managed.
-
-        This method extracts an in-memory copy of the resource and performs the
-        query on that once it has checked if any of the server managed terms is
-        in the delta. If it is, it raises an exception.
-
-        NOTE: This only checks if a server-managed term is effectively being
-        modified. If a server-managed term is present in the query but does not
-        cause any change in the updated resource, no error is raised.
-
-        :rtype: tuple(rdflib.Graph)
-        :return: Remove and add graphs. These can be used
-        with ``BaseStoreLayout.update_resource`` and/or recorded as separate
-        events in a provenance tracking system.
-        """
-        logger.debug('Provided SPARQL query: {}'.format(q))
-        pre_gr = self.imr.graph
-
-        post_gr = pre_gr | Graph()
-        post_gr.update(q)
-
-        remove_gr, add_gr = self._dedup_deltas(pre_gr, post_gr)
-
-        #logger.debug('Removing: {}'.format(
-        #    remove_gr.serialize(format='turtle').decode('utf8')))
-        #logger.debug('Adding: {}'.format(
-        #    add_gr.serialize(format='turtle').decode('utf8')))
-
-        remove_gr = self._check_mgd_terms(remove_gr)
-        add_gr = self._check_mgd_terms(add_gr)
-
-        return set(remove_gr), set(add_gr)

+ 29 - 30
lakesuperior/store/ldp_rs/rsrc_centric_layout.py

@@ -221,11 +221,11 @@ class RsrcCentricLayout:
         return self.ds.query(qry_str)


-    def extract_imr(
+    def get_imr(
                 self, uid, ver_uid=None, strict=True, incl_inbound=False,
                 incl_children=True, embed_children=False, **kwargs):
         """
-        See base_rdf_layout.extract_imr.
+        See base_rdf_layout.get_imr.
         """
         if ver_uid:
             uid = self.snapshot_uid(uid, ver_uid)
@@ -241,22 +241,20 @@ class RsrcCentricLayout:
                 for gr in graphs]
         resultset = set(chain.from_iterable(rsrc_graphs))

-        gr = Graph()
-        gr += resultset
+        imr = Graph(identifier=nsc['fcres'][uid])
+        imr += resultset

         # Include inbound relationships.
-        if incl_inbound and len(gr):
-            gr += self.get_inbound_rel(nsc['fcres'][uid])
+        if incl_inbound and len(imr):
+            imr += self.get_inbound_rel(nsc['fcres'][uid])

         #logger.debug('Found resource: {}'.format(
-        #        gr.serialize(format='turtle').decode('utf-8')))
-
-        rsrc = Resource(gr, nsc['fcres'][uid])
+        #        imr.serialize(format='turtle').decode('utf-8')))

         if strict:
-            self._check_rsrc_status(rsrc)
+            self._check_rsrc_status(imr)
-        return rsrc
+        return imr


     def ask_rsrc_exists(self, uid):
@@ -276,14 +274,14 @@ class RsrcCentricLayout:
         logger.debug('Getting metadata for: {}'.format(uid))
         if ver_uid:
             uid = self.snapshot_uid(uid, ver_uid)
-        gr = self.ds.graph(nsc['fcadmin'][uid]) | Graph()
         uri = nsc['fcres'][uid]
+        gr = Graph(identifier=uri)
+        gr += self.ds.graph(nsc['fcadmin'][uid])
-        rsrc = Resource(gr, uri)
         if strict:
-            self._check_rsrc_status(rsrc)
+            self._check_rsrc_status(gr)
-        return rsrc
+        return gr


     def get_user_data(self, uid):
@@ -295,9 +293,10 @@ class RsrcCentricLayout:
         # *TODO* This only works as long as there is only one user-provided
         # graph. If multiple user-provided graphs will be supported, this
         # should use another query to get all of them.
-        userdata_gr = self.ds.graph(nsc['fcmain'][uid])
+        userdata_gr = Graph(identifier=nsc['fcres'][uid])
+        userdata_gr += self.ds.graph(nsc['fcmain'][uid])
-        return userdata_gr | Graph()
+        return userdata_gr


     def get_version_info(self, uid, strict=True):
@@ -331,12 +330,12 @@ class RsrcCentricLayout:
             'ag': nsc['fcadmin'][uid],
             'hg': HIST_GR_URI,
             's': nsc['fcres'][uid]})
-        rsrc = Resource(gr, nsc['fcres'][uid])
-        # TODO Should return a graph.
+        ver_info_gr = Graph(identifier=nsc['fcres'][uid])
+        ver_info_gr += gr
         if strict:
         if strict:
-            self._check_rsrc_status(rsrc)
+            self._check_rsrc_status(ver_info_gr)
 
 
-        return rsrc
+        return ver_info_gr
 
 
 
 
     def get_inbound_rel(self, subj_uri, full_triple=True):
     def get_inbound_rel(self, subj_uri, full_triple=True):
@@ -566,23 +565,23 @@ class RsrcCentricLayout:
 
 
     ## PROTECTED MEMBERS ##
     ## PROTECTED MEMBERS ##
 
 
-    def _check_rsrc_status(self, rsrc):
+    def _check_rsrc_status(self, gr):
         """
         """
         Check if a resource is not existing or if it is a tombstone.
         Check if a resource is not existing or if it is a tombstone.
         """
         """
-        uid = self.uri_to_uid(rsrc.identifier)
-        if not len(rsrc.graph):
+        uid = self.uri_to_uid(gr.identifier)
+        if not len(gr):
             raise ResourceNotExistsError(uid)
             raise ResourceNotExistsError(uid)
 
 
         # Check if resource is a tombstone.
         # Check if resource is a tombstone.
-        if rsrc[RDF.type : nsc['fcsystem'].Tombstone]:
+        if gr[gr.identifier : RDF.type : nsc['fcsystem'].Tombstone]:
             raise TombstoneError(
             raise TombstoneError(
-                    uid, rsrc.value(nsc['fcrepo'].created))
-        elif rsrc.value(nsc['fcsystem'].tombstone):
+                    uid, gr.value(gr.identifier, nsc['fcrepo'].created))
+        elif gr.value(gr.identifier, nsc['fcsystem'].tombstone):
             raise TombstoneError(
             raise TombstoneError(
-                    self.uri_to_uid(
-                        rsrc.value(nsc['fcsystem'].tombstone).identifier),
-                        rsrc.value(nsc['fcrepo'].created))
+                self.uri_to_uid(
+                    gr.value(gr.identifier, nsc['fcsystem'].tombstone)),
+                gr.value(gr.identifier, nsc['fcrepo'].created))
 
 
 
 
     def _parse_construct(self, qry, init_bindings={}):
     def _parse_construct(self, qry, init_bindings={}):

+ 1 - 1
setup.py

@@ -27,7 +27,7 @@ with open(path.join(here, 'README.rst'), encoding='utf-8') as f:
 
 setup(
     name='lakesuperior',
-    version='1.0.0a11',
+    version='1.0.0a12',
 
     description='A Linked Data Platform repository server.',
     long_description=long_description,

+ 3 - 3
tests/endpoints/test_ldp.py

@@ -66,7 +66,7 @@ class TestLdp:
         assert put2_resp.status_code == 204
 
         put2_resp = self.client.put(path)
-        assert put2_resp.status_code == 409
+        assert put2_resp.status_code == 204
 
 
     def test_put_tree(self, client):
@@ -561,7 +561,7 @@ class TestPrefHeader:
         '''
         Trying to PUT an existing resource should:
 
-        - Return a 409 if the payload is empty
+        - Return a 204 if the payload is empty
        - Return a 204 if the payload is RDF, server-managed triples are
          included and the 'Prefer' header is set to 'handling=lenient'
        - Return a 412 (ServerManagedTermError) if the payload is RDF,
@@ -571,7 +571,7 @@ class TestPrefHeader:
        path = '/ldp/put_pref_header01'
        assert self.client.put(path).status_code == 201
        assert self.client.get(path).status_code == 200
-        assert self.client.put(path).status_code == 409
+        assert self.client.put(path).status_code == 204
 
        # Default handling is strict.
        with open('tests/data/rdf_payload_w_srv_mgd_trp.ttl', 'rb') as f:
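
The updated assertions capture a behavior change: an empty-payload PUT to an existing resource is now treated as an idempotent replace (204 No Content) rather than a conflict (409). A hypothetical sketch of the status-code choice these tests imply (the helper name is invented for illustration, not part of the codebase):

```python
def put_status_code(resource_exists: bool) -> int:
    """Status code for a successful LDP PUT: 201 on first creation,
    204 when an existing resource is replaced in place."""
    return 204 if resource_exists else 201

# The first PUT creates the resource; every later PUT replaces it.
assert put_status_code(resource_exists=False) == 201
assert put_status_code(resource_exists=True) == 204
```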

+ 227 - 0
tests/test_resource_api.py

@@ -0,0 +1,227 @@
+import pdb
+import pytest
+
+from io import BytesIO
+from uuid import uuid4
+
+from rdflib import Graph, Literal, URIRef
+
+from lakesuperior.api import resource as rsrc_api
+from lakesuperior.dictionaries.namespaces import ns_collection as nsc
+from lakesuperior.exceptions import (
+        IncompatibleLdpTypeError, InvalidResourceError, ResourceNotExistsError,
+        TombstoneError)
+from lakesuperior.globals import RES_CREATED, RES_UPDATED
+from lakesuperior.model.ldpr import Ldpr
+
+
+@pytest.fixture(scope='module')
+def random_uuid():
+    return str(uuid4())
+
+
+@pytest.mark.usefixtures('db')
+class TestResourceApi:
+    '''
+    Test interaction with the Resource API.
+    '''
+    def test_nodes_exist(self):
+        """
+        Verify whether nodes exist or not.
+        """
+        assert rsrc_api.exists('/') is True
+        assert rsrc_api.exists('/{}'.format(uuid4())) is False
+
+
+    def test_get_root_node_metadata(self):
+        """
+        Get the root node metadata.
+
+        The ``dcterms:title`` property should NOT be included.
+        """
+        gr = rsrc_api.get_metadata('/')
+        assert isinstance(gr, Graph)
+        assert len(gr) == 9
+        assert gr[gr.identifier : nsc['rdf'].type : nsc['ldp'].Resource]
+        assert not gr[gr.identifier : nsc['dcterms'].title : Literal('Repository Root')]
+
+
+    def test_get_root_node(self):
+        """
+        Get the root node.
+
+        The ``dcterms:title`` property should be included.
+        """
+        rsrc = rsrc_api.get('/')
+        assert isinstance(rsrc, Ldpr)
+        gr = rsrc.imr
+        assert len(gr) == 10
+        assert gr[gr.identifier : nsc['rdf'].type : nsc['ldp'].Resource]
+        assert gr[
+            gr.identifier : nsc['dcterms'].title : Literal('Repository Root')]
+
+
+    def test_get_nonexisting_node(self):
+        """
+        Get a non-existing node.
+        """
+        with pytest.raises(ResourceNotExistsError):
+            gr = rsrc_api.get('/{}'.format(uuid4()))
+
+
+    def test_create_ldp_rs(self):
+        """
+        Create an RDF resource (LDP-RS) from a provided graph.
+        """
+        uid = '/rsrc_from_graph'
+        uri = nsc['fcres'][uid]
+        gr = Graph().parse(
+            data='<> a <http://ex.org/type#A> .', format='turtle',
+            publicID=uri)
+        #pdb.set_trace()
+        evt = rsrc_api.create_or_replace(uid, graph=gr)
+
+        rsrc = rsrc_api.get(uid)
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : URIRef('http://ex.org/type#A')]
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : nsc['ldp'].RDFSource]
+
+
+    def test_create_ldp_nr(self):
+        """
+        Create a non-RDF resource (LDP-NR).
+        """
+        uid = '/{}'.format(uuid4())
+        data = b'Hello. This is some dummy content.'
+        rsrc_api.create_or_replace(
+                uid, stream=BytesIO(data), mimetype='text/plain')
+
+        rsrc = rsrc_api.get(uid)
+        assert rsrc.content.read() == data
+
+
+    def test_replace_rsrc(self):
+        uid = '/test_replace'
+        uri = nsc['fcres'][uid]
+        gr1 = Graph().parse(
+            data='<> a <http://ex.org/type#A> .', format='turtle',
+            publicID=uri)
+        evt = rsrc_api.create_or_replace(uid, graph=gr1)
+        assert evt == RES_CREATED
+
+        rsrc = rsrc_api.get(uid)
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : URIRef('http://ex.org/type#A')]
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : nsc['ldp'].RDFSource]
+
+        gr2 = Graph().parse(
+            data='<> a <http://ex.org/type#B> .', format='turtle',
+            publicID=uri)
+        #pdb.set_trace()
+        evt = rsrc_api.create_or_replace(uid, graph=gr2)
+        assert evt == RES_UPDATED
+
+        rsrc = rsrc_api.get(uid)
+        assert not rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : URIRef('http://ex.org/type#A')]
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : URIRef('http://ex.org/type#B')]
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : nsc['ldp'].RDFSource]
+
+
+    def test_replace_incompatible_type(self):
+        """
+        Verify replacing resources with incompatible type.
+
+        Replacing a LDP-NR with a LDP-RS, or vice versa, should fail.
+        """
+        uid_rs = '/test_incomp_rs'
+        uid_nr = '/test_incomp_nr'
+        data = b'mock binary content'
+        gr = Graph().parse(
+            data='<> a <http://ex.org/type#A> .', format='turtle',
+            publicID=nsc['fcres'][uid_rs])
+
+        rsrc_api.create_or_replace(uid_rs, graph=gr)
+        rsrc_api.create_or_replace(
+            uid_nr, stream=BytesIO(data), mimetype='text/plain')
+
+        with pytest.raises(IncompatibleLdpTypeError):
+            rsrc_api.create_or_replace(uid_nr, graph=gr)
+
+        with pytest.raises(IncompatibleLdpTypeError):
+            rsrc_api.create_or_replace(
+                uid_rs, stream=BytesIO(data), mimetype='text/plain')
+
+        with pytest.raises(IncompatibleLdpTypeError):
+            rsrc_api.create_or_replace(uid_nr)
+
+
+    def test_delta_update(self):
+        """
+        Update a resource with two sets of add and remove triples.
+        """
+        uid = '/test_delta_patch'
+        uri = nsc['fcres'][uid]
+        init_trp = {
+            (URIRef(uri), nsc['rdf'].type, nsc['foaf'].Person),
+            (URIRef(uri), nsc['foaf'].name, Literal('Joe Bob')),
+        }
+        remove_trp = {
+            (URIRef(uri), nsc['rdf'].type, nsc['foaf'].Person),
+        }
+        add_trp = {
+            (URIRef(uri), nsc['rdf'].type, nsc['foaf'].Organization),
+        }
+
+        gr = Graph()
+        gr += init_trp
+        rsrc_api.create_or_replace(uid, graph=gr)
+        rsrc_api.update_delta(uid, remove_trp, add_trp)
+        rsrc = rsrc_api.get(uid)
+
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : nsc['foaf'].Organization]
+        assert rsrc.imr[rsrc.uri : nsc['foaf'].name : Literal('Joe Bob')]
+        assert not rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : nsc['foaf'].Person]
+
+
+    def test_delta_update_wildcard(self):
+        """
+        Update a resource using wildcard modifiers.
+        """
+        uid = '/test_delta_patch_wc'
+        uri = nsc['fcres'][uid]
+        init_trp = {
+            (URIRef(uri), nsc['rdf'].type, nsc['foaf'].Person),
+            (URIRef(uri), nsc['foaf'].name, Literal('Joe Bob')),
+            (URIRef(uri), nsc['foaf'].name, Literal('Joe Average Bob')),
+            (URIRef(uri), nsc['foaf'].name, Literal('Joe 12oz Bob')),
+        }
+        remove_trp = {
+            (URIRef(uri), nsc['foaf'].name, None),
+        }
+        add_trp = {
+            (URIRef(uri), nsc['foaf'].name, Literal('Joan Knob')),
+        }
+
+        gr = Graph()
+        gr += init_trp
+        rsrc_api.create_or_replace(uid, graph=gr)
+        rsrc_api.update_delta(uid, remove_trp, add_trp)
+        rsrc = rsrc_api.get(uid)
+
+        assert rsrc.imr[
+                rsrc.uri : nsc['rdf'].type : nsc['foaf'].Person]
+        assert rsrc.imr[rsrc.uri : nsc['foaf'].name : Literal('Joan Knob')]
+        assert not rsrc.imr[rsrc.uri : nsc['foaf'].name : Literal('Joe Bob')]
+        assert not rsrc.imr[
+            rsrc.uri : nsc['foaf'].name : Literal('Joe Average Bob')]
+        assert not rsrc.imr[
+            rsrc.uri : nsc['foaf'].name : Literal('Joe 12oz Bob')]
+
+