Mastering Midjourney v7: The Ultimate Guide to New Parameters & Features for Designers
For designers, illustrators, and creative teams, the days of "prompt and hope for the best" are over. With Midjourney v7, you get a powerful set of parameters to lock in styles, maintain character consistency, and fine-tune every detail with precision.
And the best part? You can use these tools directly in Artflo. Whether you're building a consistent IP for a graphic novel, iterating on UI concepts, or experimenting with new visual styles, Artflo helps you integrate Midjourney v7 into your workflow seamlessly—so AI generation becomes a reliable creative partner, not a guessing game.
Core Parameters for Mastering MJ v7
With Midjourney v7, the focus has shifted from describing everything in words to guiding the AI with reference images. While character-specific references are no longer supported, you can now control your outputs with two powerful tools: Omni‑Reference, which brings elements from reference images into your scene, and Style Reference, which applies a chosen style consistently across generations.
Omni Reference (--oref)
Omni‑Reference literally means "all‑purpose reference." According to Midjourney's official description, it lets you "add specified elements into an image," whether that's a character, object, vehicle, or non-human creature. Essentially, it's a universal subject reference. Omni‑Reference supports stylization and can be combined with personalization, style references, and moodboards.
- How to Use Omni Reference in Artflo: After uploading a reference image to the Input node, type --oref in your prompt and then select the reference image you want to use from the list that appears.
- Pro Tip: You can also use --ow to control how strictly the reference image is applied, with a range from 0 to 1000 (default 100).
--ow 25 – Ideal for style transformations, like turning a photo into a cartoon.
--ow 400 – Keeps key details consistent, such as a character's face or clothing.
--ow 1000 – Almost fully replicates the reference image, though the result may look rigid.
This gives you precise control over how much influence the reference image has on the final output, letting you balance creativity and fidelity. It's important to note that the --stylize and --exp parameters also interact with Omni‑Reference, competing for influence over the final image. As a result, you'll need to adjust the Omni‑Reference weight accordingly when modifying these settings.
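For example, a typical Omni‑Reference prompt in Artflo might look like the following (the bracketed reference image and the --ow value are illustrative starting points, not required settings):
A medieval knight walking through a neon-lit market, cinematic lighting --oref [Character Image] --ow 400 --ar 16:9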
Style Reference (--sref)
In earlier versions, adding the --sref random suffix would generate a random style code. The system actually has 2³² (over 4.2 billion) possible style codes, and a big part of the Midjourney community's fun was discovering, testing, and sharing these codes.
However, v7 introduces a completely new style reference system, which changes how style codes work. Style codes from v6.1 no longer carry over to v7; to make an older code work, you need to specify its style version, for example --sv 4. V7 now uses its own default style reference system, --sv 6, which the default image model already applies. This new system offers a fresh framework for experimenting with styles while keeping outputs consistent and predictable.
You can find style codes shared by others here: Midjourney --sref codes.
- How to Use Style Reference in Artflo
- Generate a random style code: In the Input prompt, type --sref random. Midjourney will randomly select a preset style from its internal library. After submitting the prompt, random is converted into a numeric style code (sref code), which you can reuse in future prompts to apply the same style consistently.
- Apply a style from a reference image: Upload a reference image to the Input node, type --sref in your prompt, and then select the reference image you want to use from the list that appears.
- Use a specific style code: Type --sref followed by a style code in your Input prompt. The generated result will adopt the style corresponding to that code. For example: --sref 364111995 --sv 4. Here, --sv specifies the style version. In v7, the default style reference system is --sv 6, so specifying an older version (like --sv 4) is only needed if you want to use legacy style codes.
- Pro Tip: You can also use --sw to adjust how strongly a selected style reference influences the final output. The value ranges from 0 to 1000 (default is 100), giving you granular control over how aggressively the style is applied.
--sw 25 – Subtle style influence. Applies the reference style lightly while preserving most of the original prompt's look and structure. Ideal when you want a gentle stylistic touch without transforming the scene.
--sw 400 – Balanced, intentional style application. Keeps the output aligned with your chosen style while still allowing prompts and composition to contribute meaningfully. Great for consistent art direction, character design, and world-building.
--sw 1000 – Strong, dominant stylization. Forces the style reference to heavily dictate the final image. Useful for strong aesthetic transformations, but may override some prompt details or reduce variation between outputs.
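As an illustration, a style reference prompt with a moderate weight could look like this (the bracketed style image and the --sw value are placeholders to adjust for your own project):
Isometric illustration of a small coastal town at dawn, soft pastel light --sref [Style Image] --sw 400 --ar 3:2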
Personalization (--p)
Midjourney can learn the types of images you like and use that information to create your Global Personalization Profile, generating custom images that reflect your personal taste and unique style. You can also create additional profiles to focus on specific styles or explore different aesthetics.
Your personalization profiles evolve as you like and rank more images. Each time your profile updates, it generates new codes that act like labels, helping Midjourney apply the right style and preferences to different versions of your creations.
- How to Use Personalization in Artflo: Add --p to the end of your prompt, and Midjourney will apply your personal bias to the generation. For example: Generate portrait of a cat --p xjspemh.
Experimental Parameter (--exp)
Midjourney v7 introduces an experimental parameter called --exp, which stands for experimental image aesthetics. This setting works alongside --stylize, but instead of just controlling how much artistic interpretation the model applies, --exp adds another dimension of visual expression—enhancing detail, dynamism, and creative flair in your images.
- How to Use Experimental Parameter in Artflo: Add --exp to the end of your prompt, and set an appropriate value.
- Pro Tip: You can use --exp with a value from 0 to 100 (default is 0). Higher values tend to increase image complexity and tone richness, but they can also reduce prompt accuracy and diversity if pushed too far.
Lower values (e.g., 5 or 10) – Add subtle enhancements without overwhelming your prompt's intent.
Moderate values (e.g., 25 or 50) – Produce more noticeable increases in detail and visual energy, ideal for artistic or textured results.
Very high values (e.g., 100) – Push the aesthetics to an extreme, but may overpower other parameters like --stylize and even reduce how closely the image follows your prompt.
Because --exp influences creativity and visual style, it's often best used in combination with other parameters—but with care. Extremely high values may dominate the creative effects and compete with settings like --stylize or personalization, so start with mid‑range values and adjust based on your needs.
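For example, a mid-range experimental prompt might look like this (the values shown are starting points to tune, not recommendations):
Macro photograph of dew on a spider web at sunrise, shallow depth of field --exp 25 --stylize 250 --ar 3:2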
Other Common Parameters
| Parameter | Description | Use Case |
|---|---|---|
| --ar | Sets image width-to-height ratio. Default is 1:1. Available aspect ratios include 1:1, 16:9, 9:16, 4:3, 3:2, and 2:1 | A futuristic cityscape --ar 1:1 |
| --no | Tells Midjourney what not to include in your image. Add multiple items separated by commas. Using --no is more effective than saying "without" or "don't" in the prompt. Note: --no item1, item2 is equivalent to using a negative weight in multi-prompting (item1::-.5, item2::-.5) | Nature-inspired forest landscape design, full of greenery and sunlight, --no animals |
| --q | Determines rendering time and quality. Higher values consume more resources. Default is 1 | Futuristic architectural design with metallic and glass materials, highly detailed, --q 2 |
| --seed | Sets the starting point for generation. The same seed and prompt produce similar images | Vintage-style ceramic cup design with traditional hand-painted patterns, --seed 12345 |
| --v | Selects the Midjourney algorithm version | Modern minimalist café design, featuring white and gray tones, smooth lines, --v 6.1 |
| --iw | Adjusts the influence of image prompts vs text prompts. Default is 1 | A woman with long flowing hair standing under a tree, --iw 1.5 |
| --s | Controls strength of Midjourney's default aesthetic style | Colorful risograph of a fig --s 100 |
| --tile | Generates images suitable for seamless patterns | Geometric pattern ceramic tile design, featuring deep blue and gold tones, --tile |
| --style raw | Applies an alternative aesthetic with less automatic beautification, following the prompt more literally | A photo of an orange cat --style raw |
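To see how these flags combine, here is an illustrative prompt mixing several of the parameters above (the specific values are examples only):
Vintage travel poster of a mountain village at dusk, warm muted colors --ar 3:2 --no text, people --s 250 --seed 42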
3 Ways Designers Are Using MJ v7
High-Fidelity Product Photography
A common problem in AI image generation is keeping products looking exactly the same. Clients often want to place their real product bottle into a new background, but many AI models change the label, distort the text, or even alter the bottle's shape. With v7, this problem is solved using Omni-Reference (--oref), which allows the AI to closely follow the original product image and keep key details accurate while generating new scenes.
- How to Complete This Workflow in Artflo Using Midjourney v7
- Upload a clean photo of the product (white background works best) to the Input node. Type --oref in your prompt and choose the photo from the list.
- Set --ow to a high value (approx. 300-500) so the product's key details stay consistent.
- Use --exp 10 to add realistic micro-textures (like glass imperfections or paper grain) so it doesn't look like a plastic render.
- The Prompt: Commercial photography, luxury perfume bottle on a rock in the ocean, splashing water, golden hour sunlight, 8k resolution --oref [Product Image] --exp 10 --ar 4:5
Create an Interior Design Moodboard
Imagine you're preparing a proposal for a client who wants a bohemian-style room full of vibrant colors, layered textures, and eclectic decor. Instead of piecing images together manually, Midjourney v7 can create a harmonized moodboard.
- How to Complete This Workflow in Artflo Using Midjourney v7
- Create an Input node to type in your prompt.
- Add realistic micro-textures by using --exp 10. This helps details like fabrics, wood grain, or subtle lighting imperfections appear more natural, avoiding a flat or "plastic" look.
- Set the image aspect ratio with --ar. This ensures your generated images match the layout you want, such as 16:9 for wide room shots or 1:1 for square previews.
- The Prompt: Cinematic shot of a cozy bohemian living room, eclectic patterns everywhere, explosion of vibrant colors, lush indoor jungle, moroccan lanterns casting intricate shadows, dreamlike atmosphere, volumetric lighting, rich textures, cozy and inviting, detailed interior design render --exp 10 --ar 16:9 --chaos 20 --stylize 500 --v 7
Create Architectural Posters
In architectural visualization, nothing grabs attention like a striking, minimalist poster. With Midjourney v7 in Artflo, designers and architects can generate high-fidelity posters of buildings in minutes—no 3D modeling or complex rendering required.
- How to Complete This Workflow in Artflo Using Midjourney v7
- If you want to replicate a specific building, upload a photo. The reference image helps Midjourney v7 accurately capture the structure, materials, and lighting.
- Enter your architectural poster prompt, using --oref to reference the uploaded image (if any), and --exp 10 to enhance textures.
- Use --ar to define the poster format. For example, 9:16 or 3:4 for vertical posters suitable for print or social media, or 16:9 for panoramic views.
- The Prompt: Architectural poster of a futuristic skyscraper, sunset lighting, dramatic shadows, cinematic perspective, --exp 10 --ar 9:16 --v 7