<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>SREF on KnightLi Blog</title>
        <link>https://www.knightli.com/en/tags/sref/</link>
        <description>Recent content in SREF on KnightLi Blog</description>
        <generator>Hugo -- gohugo.io</generator>
        <language>en</language>
        <lastBuildDate>Sun, 17 May 2026 20:20:51 +0800</lastBuildDate><atom:link href="https://www.knightli.com/en/tags/sref/index.xml" rel="self" type="application/rss+xml" /><item>
        <title>Midjourney May 2026 Update: Conversational Mode, AI-Assisted Development, and SREF Organization</title>
        <link>https://www.knightli.com/en/2026/05/17/midjourney-2026-05-office-hours-conversational-mode/</link>
        <pubDate>Sun, 17 May 2026 20:20:51 +0800</pubDate>
        
        <guid>https://www.knightli.com/en/2026/05/17/midjourney-2026-05-office-hours-conversational-mode/</guid>
        <description>&lt;p&gt;The most important signal from Midjourney&amp;rsquo;s May 14, 2026 Office Hours is not a single model parameter. It is that the product is continuing to move from &amp;ldquo;type a prompt and generate an image&amp;rdquo; toward a more conversational, organized, and iterative creative system.&lt;/p&gt;
&lt;p&gt;The information comes from a Japanese summary of Midjourney&amp;rsquo;s recent Q&amp;amp;A, covering conversational mode, AI-assisted development, website redesign, SREF and tag organization, Omni-reference, multi-character consistency, and how the team itself uses Midjourney.&lt;/p&gt;
&lt;p&gt;In one sentence: Midjourney is turning image generation into a creative system you can talk to, organize, and iterate on.&lt;/p&gt;
&lt;h2 id=&#34;conversational-mode-is-becoming-more-important&#34;&gt;Conversational mode is becoming more important
&lt;/h2&gt;&lt;p&gt;The most direct change is Conversational Mode.&lt;/p&gt;
&lt;p&gt;In the past, using Midjourney still depended heavily on parameters and fixed syntax. You had to remember rules for aspect ratio, image references, style references, model parameters, and then write them into prompts or adjust them in the interface.&lt;/p&gt;
&lt;p&gt;The direction of the new conversational mode is to let users describe these settings in more natural language.&lt;/p&gt;
&lt;p&gt;For example, users can specify by voice or text:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Default parameters.&lt;/li&gt;
&lt;li&gt;Aspect ratio, such as &lt;code&gt;16:9&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Image references.&lt;/li&gt;
&lt;li&gt;Style references, or &lt;code&gt;--sref&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Omni-reference in V7.&lt;/li&gt;
&lt;/ul&gt;
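&lt;p&gt;For comparison, here is a rough sketch of the same settings written out in classic prompt syntax. The prompt text, SREF code, and reference URL are placeholders, and the exact flags (&lt;code&gt;--ar&lt;/code&gt;, &lt;code&gt;--sref&lt;/code&gt;, &lt;code&gt;--oref&lt;/code&gt;) should be checked against current Midjourney documentation:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;/imagine prompt: a quiet harbor at dusk --ar 16:9 --sref 1234567890 --oref https://example.com/character.png --v 7&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Conversational mode aims to replace this with something like &amp;ldquo;make it widescreen, keep this style, and use this character reference,&amp;rdquo; letting the system fill in the flags.&lt;/p&gt;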
&lt;p&gt;This shows Midjourney is not only improving generation quality. It is also reducing the overhead of remembering and managing parameters.&lt;/p&gt;
&lt;p&gt;For ordinary users, the biggest change is that they do not have to memorize commands all the time. For heavy users, if conversational mode becomes stable enough, it may become the main entry point for adjusting generation settings with natural language.&lt;/p&gt;
&lt;h2 id=&#34;ai-assisted-development-is-changing-midjourneys-iteration-speed&#34;&gt;AI-assisted development is changing Midjourney&amp;rsquo;s iteration speed
&lt;/h2&gt;&lt;p&gt;Another interesting point is that the Midjourney team is using AI-assisted development at large scale internally.&lt;/p&gt;
&lt;p&gt;The source notes that the team can now fix small bugs, interface friction, and workflow issues much faster. There was even an example where a product bug was identified during a user call, fixed in real time with AI assistance, reviewed, and deployed quickly.&lt;/p&gt;
&lt;p&gt;This is more interesting than simply saying &amp;ldquo;AI helps engineers write code.&amp;rdquo;&lt;/p&gt;
&lt;p&gt;It shows that AI development tools are starting to influence how AI products themselves iterate:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;User feedback can enter the fix pipeline faster.&lt;/li&gt;
&lt;li&gt;Small experience issues are easier to address.&lt;/li&gt;
&lt;li&gt;Engineers can spend more energy on architecture, review, design decisions, and testing.&lt;/li&gt;
&lt;li&gt;Product teams can clean up edge cases more frequently.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Midjourney has many creative paths, parameter combinations, mobile experiences, search features, and organization workflows. Many issues are not about the core model failing to generate images, but about an entry point being awkward, an operation taking one extra step, or an edge state being unpleasant.&lt;/p&gt;
&lt;p&gt;AI-assisted development is especially good at accelerating these many small improvements.&lt;/p&gt;
&lt;h2 id=&#34;the-website-redesign-is-about-workflow-not-removing-features&#34;&gt;The website redesign is about workflow, not removing features
&lt;/h2&gt;&lt;p&gt;The Office Hours also mentioned a large website redesign.&lt;/p&gt;
&lt;p&gt;The goal is not to remove complex features, but to make the creative flow more intuitive, make onboarding easier, and organize tools and features more clearly.&lt;/p&gt;
&lt;p&gt;That matters.&lt;/p&gt;
&lt;p&gt;Midjourney&amp;rsquo;s problem is not a lack of features. As features grow, entry points, collections, organization, references, exploration, and reuse become more complex. For light users, the hard question is &amp;ldquo;where do I start?&amp;rdquo; For heavy users, the hard question is &amp;ldquo;how do I manage many styles, references, and experiment results?&amp;rdquo;&lt;/p&gt;
&lt;p&gt;Possible rollout strategies include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Offering old and new interfaces in parallel.&lt;/li&gt;
&lt;li&gt;Starting with an alpha test.&lt;/li&gt;
&lt;li&gt;Moving gradually to avoid disrupting heavy users.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These strategies suggest the team understands that Midjourney is not just a casual image toy. Many users have already integrated it into real creative workflows, so interface changes cannot casually break existing habits.&lt;/p&gt;
&lt;h2 id=&#34;sref-styles-and-tags-remain-pain-points&#34;&gt;SREF, styles, and tags remain pain points
&lt;/h2&gt;&lt;p&gt;SREF and style organization were among the most interesting topics in the Q&amp;amp;A.&lt;/p&gt;
&lt;p&gt;Users want better organization systems, especially for:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Random SREF.&lt;/li&gt;
&lt;li&gt;Style references.&lt;/li&gt;
&lt;li&gt;Saved aesthetics.&lt;/li&gt;
&lt;li&gt;Tags and colored tags.&lt;/li&gt;
&lt;li&gt;Stronger filtering, grouping, and reuse.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;But the team also raised a question: if the current folder system already lets one image belong to multiple folders, supports unlimited folders, and offers filtering and sorting, what exactly do tags provide that folders cannot?&lt;/p&gt;
&lt;p&gt;That question is practical.&lt;/p&gt;
&lt;p&gt;Many products add tags because users say they want tags. But a poorly designed tag system becomes another messy classification layer. If folders, tags, favorites, search, filters, projects, and style libraries have unclear boundaries, the system becomes harder to manage.&lt;/p&gt;
&lt;p&gt;So the Midjourney team wants concrete workflow examples: in which scenario do users need tags? Why are folders not enough? Is it for combining styles quickly, reusing across projects, filtering by theme, color tone, photography style, or character relationship?&lt;/p&gt;
&lt;p&gt;For Midjourney, the organization system may become as important as the generation model. Once users create long-term projects, the hard part is not generating one image, but managing thousands of images, hundreds of style directions, and repeated experiments.&lt;/p&gt;
&lt;h2 id=&#34;omni-reference-points-toward-more-complex-character-control&#34;&gt;Omni-reference points toward more complex character control
&lt;/h2&gt;&lt;p&gt;The source also mentioned that future Omni-reference / subject reference systems may support multiple character references at once and better separation of different subjects.&lt;/p&gt;
&lt;p&gt;This maps directly to a long-running pain point in AI image generation: character consistency and multi-character relationships.&lt;/p&gt;
&lt;p&gt;Keeping one character consistent is already difficult. Multiple characters are harder. Common problems include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Character A&amp;rsquo;s traits leaking onto character B.&lt;/li&gt;
&lt;li&gt;Identity confusion between multiple people.&lt;/li&gt;
&lt;li&gt;Clothing, hair, and facial features changing across images.&lt;/li&gt;
&lt;li&gt;Reference images influencing the whole style too strongly instead of controlling only the subject.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;If Omni-reference can handle subject separation better, Midjourney becomes more useful for comics, storyboards, advertising visuals, character design, game concept art, and continuous narratives.&lt;/p&gt;
&lt;p&gt;This is one of the areas worth watching after V7.&lt;/p&gt;
&lt;h2 id=&#34;midjourney-is-rethinking-prompts&#34;&gt;Midjourney is rethinking prompts
&lt;/h2&gt;&lt;p&gt;The summary includes a useful idea: language is an imperfect compression layer for imagination.&lt;/p&gt;
&lt;p&gt;That sentence explains Midjourney&amp;rsquo;s product direction well.&lt;/p&gt;
&lt;p&gt;Many users assume AI image generation is mainly about writing longer and more precise prompts. But in real creative work, image references, style references, moodboards, SREF, variations, regeneration, and post-processing are often more useful than a very long text prompt.&lt;/p&gt;
&lt;p&gt;Team member Duncan&amp;rsquo;s workflow reflects this. He reportedly treats Midjourney as a sketchbook, combining moodboards, SREF, short prompts, high &lt;code&gt;--repeat&lt;/code&gt; (&lt;code&gt;--r&lt;/code&gt;) counts for bulk regeneration, strong and subtle variations, Photoshop retouching, and external upscaling workflows.&lt;/p&gt;
&lt;p&gt;This shows mature Midjourney users do not work only through &amp;ldquo;magic prompts.&amp;rdquo;&lt;/p&gt;
&lt;p&gt;A more realistic process is:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Use a small amount of language to set direction.&lt;/li&gt;
&lt;li&gt;Use image references to provide visual context.&lt;/li&gt;
&lt;li&gt;Use SREF to narrow the style.&lt;/li&gt;
&lt;li&gt;Use many variations to explore the space.&lt;/li&gt;
&lt;li&gt;Use human taste to select results.&lt;/li&gt;
&lt;li&gt;Use external tools for post-processing.&lt;/li&gt;
&lt;/ol&gt;
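&lt;p&gt;As a hedged sketch of steps 1 to 4 in prompt terms (the image URL and SREF code below are placeholders, not real references):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Steps 1-3: a short prompt, an image reference up front, and a style reference
/imagine prompt: https://example.com/mood.jpg lone lighthouse, storm light --sref 987654321 --ar 3:2

# Step 4: widen the search with repeated runs, then vary the keepers
/imagine prompt: https://example.com/mood.jpg lone lighthouse, storm light --sref 987654321 --ar 3:2 --r 8&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Steps 5 and 6 happen outside the prompt: human selection, then retouching and upscaling in external tools.&lt;/p&gt;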
&lt;p&gt;Prompts still matter, but they are not everything.&lt;/p&gt;
&lt;h2 id=&#34;what-this-means-for-users&#34;&gt;What this means for users
&lt;/h2&gt;&lt;p&gt;If you only generate images occasionally, the most direct impact is that conversational mode should become easier to use. In the future, you may be able to describe desired aspect ratio, references, style, and parameters more naturally instead of memorizing commands.&lt;/p&gt;
&lt;p&gt;If you are a heavy user, three areas deserve attention.&lt;/p&gt;
&lt;p&gt;First, organization.&lt;/p&gt;
&lt;p&gt;How SREF, styles, folders, favorites, and tags evolve will directly affect long-term creative efficiency.&lt;/p&gt;
&lt;p&gt;Second, the website redesign.&lt;/p&gt;
&lt;p&gt;If the new interface can connect exploration, organization, reuse, and export, Midjourney will feel more like a professional creative tool instead of a single generator.&lt;/p&gt;
&lt;p&gt;Third, character and subject reference.&lt;/p&gt;
&lt;p&gt;If Omni-reference can reliably handle multiple characters and subject separation, Midjourney becomes better suited for continuous projects rather than only single images.&lt;/p&gt;
&lt;h2 id=&#34;summary&#34;&gt;Summary
&lt;/h2&gt;&lt;p&gt;The key point from Midjourney&amp;rsquo;s May 2026 Office Hours is not one flashy parameter. It is that the product is continuing to evolve toward a creative system.&lt;/p&gt;
&lt;p&gt;Conversational mode lowers the input barrier. AI-assisted development increases iteration speed. The website redesign aims to reorganize workflows. SREF and tag discussions point to long-term asset management. Omni-reference relates to character consistency and complex subject control.&lt;/p&gt;
&lt;p&gt;For AI image generation tools, model capability is obviously important. But once generation quality reaches a certain level, what determines whether users stay long term is often workflow, organization, controllability, and iteration speed.&lt;/p&gt;
&lt;p&gt;Midjourney is filling in those pieces.&lt;/p&gt;
&lt;h2 id=&#34;references&#34;&gt;References
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;https://note.com/akisuke0925/n/nc9e099d9c77f&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Midjourney Latest News (May 14, 2026) | Akisuke&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description>
        </item>
        
    </channel>
</rss>
