UltiMaker Community of 3D Printing Experts

Get reference pictures from preview to learn a neural network


Hanshogeland


Posted · Get reference pictures from preview to learn a neural network

Hi,

I'm working on a project where I have trained a neural network (a CNN) with data from the preview in Cura. The CNN is capable of predicting the outcome of the print, pass or fail, for each layer, and I'd like to write a plugin for Cura that grabs pictures from a couple of angles of each layer in the preview. These pictures will then be sent via SSH to an SBC (a Jetson Nano) with a Duet 3 board connected. This will make it possible to verify the print on the fly. So before I dive into the plugin writing, I'm wondering: is it even possible to generate pictures using code in a plugin?
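As a side note on the transfer step rather than the plugin API: a minimal sketch of pushing rendered layer images to the Jetson Nano over SSH with `scp`. The host, user, and remote directory names here are illustrative placeholders, not anything from Cura, and it assumes `scp` on PATH with SSH keys already set up.

```python
import subprocess

def build_scp_command(image_paths, host="jetson.local", user="nano",
                      remote_dir="/home/nano/layers"):
    """Build the scp command that copies layer images to the SBC.

    host, user, and remote_dir are placeholders; substitute your own
    Jetson Nano address and target directory.
    """
    return ["scp", *image_paths, "{}@{}:{}".format(user, host, remote_dir)]

def send_layer_images(image_paths, **kwargs):
    """Run the transfer; raises CalledProcessError if scp fails."""
    subprocess.run(build_scp_command(image_paths, **kwargs), check=True)
```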



Posted

Interesting idea! I'd love to hear about your progress. If you need any specific advice on how to make the plugin, feel free to contact me.


    • 3 weeks later...
Posted

Hi, thanks for your answers.

I have started trying to at least get the Cura plugin to start, and I've gotten far enough that I have the basic menu item covered. I copied the code from the UFP writer and tried to get the method rolling, but I'm stuck on understanding how to get the stream into the method. Since I want to run the script from the menu item, I guess I need to add something at the start to access the stream. As you'll see below, I'm pretty much a novice at this; the commented-out write method needs the input stream.

     

    from typing import List
    from typing import cast
    from Charon.VirtualFile import VirtualFile #To open UFP files.
    from Charon.OpenMode import OpenMode #To indicate that we want to write to UFP files.
    from io import StringIO #For converting g-code to bytes.
    from UM.Application import Application
    from UM.Logger import Logger
    from UM.Mesh.MeshWriter import MeshWriter #The writer we need to implement.
    from UM.MimeTypeDatabase import MimeTypeDatabase, MimeType
    from UM.PluginRegistry import PluginRegistry #To get the g-code writer.
    from PyQt5.QtCore import QBuffer
    from cura.Snapshot import Snapshot
    from cura.Utils.Threading import call_on_qt_thread
    from UM.i18n import i18nCatalog
    from UM.Extension import Extension
    from UM.Scene.SceneNode import SceneNode
    from UM.Scene.Selection import Selection
    from UM.Message import Message
    from cura.CuraApplication import CuraApplication
    import os
    
    i18n_catalog = i18nCatalog("CreateTrainingDataPlugin")
    
    class CreateTrainingDataPlugin(Extension, MeshWriter):
        def __init__(self):
            super().__init__()
            self.addMenuItem(i18n_catalog.i18n("Create training data pictures"), self.doExtendedCreateTrainingPics)
            #self.addMenuItem(i18n_catalog.i18n("Write"), self.doExtendedWrite)
    
            self._message = None
            MimeTypeDatabase.addMimeType(
                MimeType(
                    name = "application/x-ufp",
                    comment = "Ultimaker Format Package",
                    suffixes = ["ufp"]
                )
            )
            self._snapshot = None
    
        def doExtendedCreateTrainingPics(self):
            self.doCreateTrainingdata(True)
        
        #@call_on_qt_thread
        #def doExtendedWrite(self):
        #   self.write(stream)
        @call_on_qt_thread
        def doCreateTrainingdata(self, extended_mode):
            self._message = Message(i18n_catalog.i18nc("@info:status", "Creating .PNG pics"), title = i18n_catalog.i18nc("@title", "PNG Pics"))
            self._message.show()
            #self.write()
    
        def _createSnapshot(self, *args):
            # must be called from the main thread because of OpenGL
            Logger.log("d", "Creating thumbnail image...")
            try:
                self._snapshot = Snapshot.snapshot(width = 300, height = 300)
            except Exception:
                Logger.logException("w", "Failed to create snapshot image")
                self._snapshot = None
    
        @call_on_qt_thread
        def write(self, stream, nodes, mode = MeshWriter.OutputMode.BinaryMode):
            archive = VirtualFile()
            archive.openStream(stream, "application/x-ufp", OpenMode.WriteOnly)
    
            #Store the g-code from the scene.
            archive.addContentType(extension = "gcode", mime_type = "text/x-gcode")
            gcode_textio = StringIO() #We have to convert the g-code into bytes.
            gcode_writer = cast(MeshWriter, PluginRegistry.getInstance().getPluginObject("GCodeWriter"))
            success = gcode_writer.write(gcode_textio, None)
            if not success: #Writing the g-code failed. Then I can also not write the gzipped g-code.
                self.setInformation(gcode_writer.getInformation())
                return False
            gcode = archive.getStream("/3D/model.gcode")
            gcode.write(gcode_textio.getvalue().encode("UTF-8"))
            archive.addRelation(virtual_path = "/3D/model.gcode", relation_type = "http://schemas.ultimaker.org/package/2018/relationships/gcode")
    
            self._createSnapshot()
    
            #Store the thumbnail.
            if self._snapshot:
                archive.addContentType(extension = "png", mime_type = "image/png")
                thumbnail = archive.getStream("/Metadata/thumbnail.png")
    
                thumbnail_buffer = QBuffer()
                thumbnail_buffer.open(QBuffer.ReadWrite)
                thumbnail_image = self._snapshot
                thumbnail_image.save(thumbnail_buffer, "PNG")
    
                thumbnail.write(thumbnail_buffer.data())
                archive.addRelation(virtual_path = "/Metadata/thumbnail.png", relation_type = "http://schemas.openxmlformats.org/package/2006/relationships/metadata/thumbnail", origin = "/3D/model.gcode")
            else:
                Logger.log("d", "Thumbnail not created, cannot save it")

     


    • 7 months later...
Posted (edited)

    Hello

     

I'm certainly missing something, but I don't know what. I'm trying to use the same code to create a snapshot. I can get a file, but the result is absolutely not correct. If I use the UFPWriter plugin I get the result thumbnail.png (the first image), but my result is my_thumbnail.png (the second image)?

     

Current source code:

    # Copyright (c) 2020
    # The SimpleShapes plugin is released under the terms of the AGPLv3 or higher.
    
    from PyQt5.QtCore import QObject
    from PyQt5.QtCore import QBuffer
    
    from UM.Extension import Extension
    from cura.CuraApplication import CuraApplication
    from cura.Snapshot import Snapshot
    from cura.Utils.Threading import call_on_qt_thread
    
    from UM.Application import Application
    from UM.Logger import Logger
    from UM.Message import Message
    from UM.MimeTypeDatabase import MimeTypeDatabase, MimeType
    
    from UM.i18n import i18nCatalog
    catalog = i18nCatalog("cura")
    
    class CreateSnapShot(Extension, QObject,):
        
        def __init__(self, parent = None) -> None:
            QObject.__init__(self, parent)
            Extension.__init__(self)
    
# careful: not the same name as the menu item
            self.addMenuItem(catalog.i18nc("@item:inmenu", "Snap"), self.doExtendedCreateTrainingPics)
            
            self._snapshot = None
           
     
        def doExtendedCreateTrainingPics(self):
            self.doCreateTrainingdata(True)
    
        #def doExtendedWrite(self):
        #   self.write(stream)
        @call_on_qt_thread
        def doCreateTrainingdata(self, extended_mode):
            self._write()
            self._message = Message(catalog.i18nc("@info:status", "Creating .PNG pics"), title = catalog.i18nc("@title", "PNG Pics"))
            self._message.show()
            
            
        def _createSnapshot(self, *args):
            # must be called from the main thread because of OpenGL
            Logger.log("d", "Creating thumbnail image...")
            try:
                self._snapshot = Snapshot.snapshot(width = 300, height = 300)
            except Exception:
                Logger.logException("w", "Failed to create snapshot image")
                self._snapshot = None
                
        @call_on_qt_thread
        def _write(self):
    
            self._createSnapshot()
    
            #Store the thumbnail.
            if self._snapshot:
                
                thumbnail_image = self._snapshot
                thumbnail_image.save("C:/temp/thumbnail.png", "PNG")
                Logger.log("d", "Thumbnail creation")
                
            else:
                Logger.log("d", "Thumbnail not created, cannot save it")
                

     

    thumbnail.png

    my_thumbnail.png

    Edited by Cuq

Posted

This puzzled me a bit as well; it seems that the snapshot doesn't work correctly.

    So for now you can get it to work by creating a job and creating the snapshots inside that job.


Posted (edited)
    1 hour ago, nallath said:

    This puzzled me a bit as well, it seems that the snapshot doesn't work correctly.
     

     

Thanks for your answer @nallath. I don't think it's an issue with the snapshot function, because if I use the UFP Writer plugin the result is correct (the first image), and normally my code starts from the same source. But there is a real problem when I write the PNG file.

     

I'm thinking of an initialization that isn't done, or a difference between the save and write methods used in the UFP plugin code. But I can't figure out what's missing.

     

    thumbnail_buffer = QBuffer()
    thumbnail_buffer.open(QBuffer.ReadWrite)
    thumbnail_image = self._snapshot
    thumbnail_image.save(thumbnail_buffer, "PNG")
    
    thumbnail.write(thumbnail_buffer.data())

    versus

    thumbnail_image = self._snapshot
    thumbnail_image.save("C:/temp/thumbnail.png", "PNG")

     

    Edited by Cuq

Posted

Tested also with:

    thumbnail_buffer = QBuffer()
    thumbnail_buffer.open(QBuffer.ReadWrite)
    thumbnail_image = self._snapshot
    thumbnail_image.save(thumbnail_buffer, "PNG")
    filehandle = open(Filename, 'w+b')
    filehandle.write(thumbnail_buffer.data())
    filehandle.close()

Same result.


Posted

The UFP write code is called from a job, which means it's started from a non-main thread. The @call_on_qt_thread decorator ensures that it is later called on the main thread.

     

The problem seems to be that when the snapshot is called directly from the main thread, it gives the weird results. That's also why I suggested the job workaround. So to get this to work:

Create a "WriteScreenshotJob" (a subclass of Job), let the screenshot be taken in its run() method, and ensure that the snapshot bit runs in a function with the call_on_qt_thread decorator.


Posted (edited)

Thanks for your answer, but your explanation is far beyond my knowledge of Python. So I will wait a while and hope to find sample code like that on GitHub one day 🙂

    Edited by Cuq

Posted (edited)

Tested with:

    class InterCallObject:
        def __init__(self):
            self.finish_event = threading.Event()
            self.result = None
    
    
    def call_on_qt_thread(func):
        def _call_on_qt_thread_wrapper(*args, **kwargs):
            def _handle_call(ico, *args, **kwargs):
                ico.result = func(*args, **kwargs)
                ico.finish_event.set()
            
            inter_call_object = InterCallObject()
            new_args = tuple([inter_call_object] + list(args)[:])
            Logger.log("d", "new_args = %s", new_args)
            CuraApplication.getInstance().callLater(_handle_call, *new_args, **kwargs)
            inter_call_object.finish_event.wait()
            return inter_call_object.result
        return _call_on_qt_thread_wrapper

     

but inter_call_object.finish_event.wait() ends up blocking forever
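That hang is the classic symptom of waiting on the loop you are blocking: if the wrapper is invoked on the Qt main thread itself, callLater queues _handle_call on that same thread, but finish_event.wait() blocks it, so the callback never runs and the event is never set. The pattern only works when the caller is a worker thread (which is what the job workaround gives you). A stripped-down sketch of the same logic, with a plain Python thread standing in for the Qt event loop (no Cura or Qt here, just the threading pattern):

```python
import threading
import queue

class FakeEventLoop:
    """Stand-in for the Qt main loop: a thread that runs posted callbacks."""
    def __init__(self):
        self._queue = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def call_later(self, func):
        self._queue.put(func)

    def _run(self):
        while True:
            self._queue.get()()  # run posted callbacks one by one

_loop = FakeEventLoop()

def call_on_loop_thread(func):
    """Same shape as Cura's call_on_qt_thread: post the call, block until done."""
    def wrapper(*args, **kwargs):
        finished = threading.Event()
        result = {}

        def handle():
            result["value"] = func(*args, **kwargs)
            finished.set()

        _loop.call_later(handle)
        # Deadlock alert: if wrapper itself ran on the loop thread, this
        # wait() would block the very thread that must execute handle().
        finished.wait()
        return result["value"]
    return wrapper

@call_on_loop_thread
def take_snapshot():
    return "png-bytes"  # placeholder for the real Snapshot.snapshot() call
```

Called from any thread other than the loop thread, `take_snapshot()` returns normally; called from the loop thread itself, it would hang exactly as described above.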

     

    Edited by Cuq

Posted
    On 5/29/2020 at 12:06 PM, Cuq said:

Thanks for your answer, but your explanation is far beyond my knowledge of Python. So I will wait a while and hope to find sample code like that on GitHub one day 🙂

     

Never give up! 3 years later: https://github.com/5axes/CuraSettingsWriter

     


     

     

At the same time I learned that you can embed the image directly in the HTML file.
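For reference, that embedding works by base64-encoding the PNG bytes into a `data:` URI inside the `<img>` tag, so the HTML report is a single self-contained file. A small sketch (the bytes passed in below are just a stand-in, not a real image):

```python
import base64

def embed_png(png_bytes: bytes) -> str:
    """Return an <img> tag with the PNG inlined as a base64 data URI."""
    b64 = base64.b64encode(png_bytes).decode("ascii")
    return '<img src="data:image/png;base64,' + b64 + '"/>'

# Any PNG bytes work; here we use the 8-byte PNG signature as a placeholder.
tag = embed_png(b"\x89PNG\r\n\x1a\n")
```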

     

